Dataset column summary (column name, value type, and range; for `stringclasses` the count is the number of distinct values, for `stringlengths` the range of string lengths, and for `listlengths` the range of list lengths):

| Column | Type | Range |
| --- | --- | --- |
| sha | null | |
| last_modified | null | |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1 to 900k |
| metadata | stringlengths | 2 to 348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5 to 122 |
| tags | listlengths | 1 to 1.84k |
| created_at | stringlengths | 25 to 25 |
| arxiv | listlengths | 0 to 201 |
| languages | listlengths | 0 to 1.83k |
| tags_str | stringlengths | 17 to 9.34k |
| text_str | stringlengths | 0 to 389k |
| text_lists | listlengths | 0 to 722 |
| processed_texts | listlengths | 1 to 723 |
| tokens_length | listlengths | 1 to 723 |
| input_texts | listlengths | 1 to 61 |
| embeddings | listlengths | 768 to 768 |

The example rows that follow list one value per column, in this order.
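As a minimal sketch, a dataset with this schema could be loaded and inspected with the `datasets` library. The repository id below is a placeholder, since the dump does not name the dataset:

```python
from datasets import load_dataset

# Placeholder repository id -- the dump above does not name the dataset.
ds = load_dataset("someuser/model-card-embeddings", split="train")

print(ds.features)             # column names and types, as summarized above
row = ds[0]
print(row["id"])               # e.g. "huggingtweets/david_desj"
print(len(row["embeddings"]))  # 768
```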
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1333105810851975169/duOCN2P4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David desJardins 🤖 AI Bot </div> <div style="font-size: 15px">@david_desj bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@david_desj's tweets](https://twitter.com/david_desj). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 41 | | Short tweets | 23 | | Tweets kept | 3186 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/391ogow1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @david_desj's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qeggjsp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qeggjsp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/david_desj') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/david_desj/1616642206560/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/david_desj
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
David desJardins AI Bot @david\_desj bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @david\_desj's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @david\_desj's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
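The row above ends with its `embeddings` value, a single 768-dimensional float vector. As a sketch of how such vectors could be compared across rows, here is a generic cosine-similarity helper; the dump does not say which encoder produced the vectors, so this is only an illustration of using the column:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical usage with the dataset sketched earlier:
# score = cosine_similarity(ds[0]["embeddings"], ds[1]["embeddings"])
```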
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/855423041161105408/f8QTAXnm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Daviz 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@david_rccv bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@david_rccv's tweets](https://twitter.com/david_rccv). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3117</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1335</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1595</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2u4p05g9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @david_rccv's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2njpujl0) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2njpujl0/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/david_rccv'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/david_rccv/1602336344954/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/david_rccv
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Daviz AI Bot </div> <div style="font-size: 15px; color: #657786">@david_rccv bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @david_rccv's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3117</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1335</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1595</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @david_rccv's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/david_rccv'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @david_rccv's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3117</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1335</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1595</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @david_rccv's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/david_rccv'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @david_rccv's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3117</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1335</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1595</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @david_rccv's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/david_rccv'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 433, 78, 9, 170, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
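In the row above, `processed_texts` holds eight text chunks and `tokens_length` holds eight integers, one per chunk, so the integers look like per-chunk token counts. A sketch of reproducing such counts follows; the dump does not say which tokenizer was used, so `gpt2` below is only a stand-in assumption:

```python
from transformers import AutoTokenizer

# Assumption: some pretrained tokenizer produced the counts; "gpt2" is a stand-in.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

processed_texts = [
    "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation ...",
    "## How does it work?\n\nThe model uses the following pipeline.",
]
tokens_length = [len(tokenizer(text)["input_ids"]) for text in processed_texts]
print(tokens_length)  # one integer per chunk, as in the row above
```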
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/745895338003828736/rrplzLVB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Gasquez 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@davidgasquez bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@davidgasquez's tweets](https://twitter.com/davidgasquez). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3065</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>674</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>66</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2325</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/26tasjrn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davidgasquez's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3ez0xoyl) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3ez0xoyl/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/davidgasquez'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/davidgasquez/1600679713505/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/davidgasquez
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Gasquez AI Bot </div> <div style="font-size: 15px; color: #657786">@davidgasquez bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @davidgasquez's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3065</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>674</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>66</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2325</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @davidgasquez's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/davidgasquez'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @davidgasquez's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3065</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>674</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>66</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2325</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @davidgasquez's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/davidgasquez'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @davidgasquez's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3065</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>674</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>66</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2325</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @davidgasquez's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/davidgasquez'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 431, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/792165528752140288/liCCmoI2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Goggins 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@davidgoggins bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@davidgoggins's tweets](https://twitter.com/davidgoggins). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>557</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>10</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>75</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>472</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3bgqr5vh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davidgoggins's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/13i4mcyp) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/13i4mcyp/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/davidgoggins'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
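The pipeline call in the card's "How to use" section can be extended with explicit sampling controls. The settings below (max_length, temperature, top_p, number of sequences) are illustrative choices, not values prescribed by the card.

```python
# Expanded usage of the pipeline call from the card's "How to use" section.
# The sampling settings are illustrative, not prescribed by the card.
from transformers import pipeline

generator = pipeline("text-generation", model="huggingtweets/davidgoggins")
outputs = generator(
    "My dream is",
    max_length=60,            # cap the tweet-style completion length
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.9,          # slightly soften the distribution
    top_p=0.95,               # nucleus sampling
    num_return_sequences=3,
)
for out in outputs:
    print(out["generated_text"])
```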
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/davidgoggins/1603830361250/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/davidgoggins
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Goggins AI Bot </div> <div style="font-size: 15px; color: #657786">@davidgoggins bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @davidgoggins's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>557</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>10</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>75</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>472</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @davidgoggins's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/davidgoggins'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @davidgoggins's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>557</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>10</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>75</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>472</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @davidgoggins's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/davidgoggins'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @davidgoggins's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>557</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>10</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>75</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>472</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @davidgoggins's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/davidgoggins'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 429, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1420788979931168770/6f7XUDnW_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Mighty</div> <div style="text-align: center; font-size: 14px;">@davidlisowsky</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Mighty. | Data | Mighty | | --- | --- | | Tweets downloaded | 156 | | Retweets | 53 | | Short tweets | 15 | | Tweets kept | 88 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hyhz2eh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davidlisowsky's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2v0yazpf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2v0yazpf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/davidlisowsky') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/davidlisowsky/1631500152718/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/davidlisowsky
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Mighty @davidlisowsky I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Mighty. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @davidlisowsky's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
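A short usage example for the text-generation pipeline step described above:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/davidlisowsky')
generator("My dream is", num_return_sequences=5)
```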
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/996598862189035520/7TV9Dej2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">David R. Liu</div> <div style="text-align: center; font-size: 14px;">@davidrliu</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from David R. Liu. | Data | David R. Liu | | --- | --- | | Tweets downloaded | 2124 | | Retweets | 952 | | Short tweets | 62 | | Tweets kept | 1110 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/29r3m2zm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davidrliu's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/27i98foi) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/27i98foi/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/davidrliu') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/davidrliu/1622126441318/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/davidrliu
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT David R. Liu @davidrliu I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from David R. Liu. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @davidrliu's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
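The processed card above refers to pipeline-based text generation, but the code snippet itself was stripped during processing. The following is a minimal sketch that mirrors the snippet in the full card for this model (huggingtweets/davidrliu).

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint behind a text-generation pipeline.
generator = pipeline('text-generation', model='huggingtweets/davidrliu')

# Sample five continuations of the prompt used in the card's widget.
generator("My dream is", num_return_sequences=5)
```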
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1395199219779219458/CNIBnZac_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">David Vizgan</div> <div style="text-align: center; font-size: 14px;">@davidvizgan</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from David Vizgan. | Data | David Vizgan | | --- | --- | | Tweets downloaded | 2075 | | Retweets | 563 | | Short tweets | 302 | | Tweets kept | 1210 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/36psf0c4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @davidvizgan's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23t5t1ij) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23t5t1ij/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/davidvizgan') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/davidvizgan/1623099475956/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/davidvizgan
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT David Vizgan @davidvizgan I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from David Vizgan. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @davidvizgan's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
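For finer control than the pipeline helper, the same checkpoint can also be loaded with the standard causal-LM classes. This is a sketch assuming the usual `AutoTokenizer`/`AutoModelForCausalLM` interface for a GPT-2 fine-tune; the sampling settings (`max_length`, `top_p`) are illustrative, not values recommended by the card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned GPT-2 checkpoint and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/davidvizgan")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/davidvizgan")

# Encode a prompt and sample one continuation.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, max_length=60, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```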
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1347223440328155144/QPmnvgm8_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">dawniee🌸☁️🌈 🤖 AI Bot </div> <div style="font-size: 15px">@dawnieedreams bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dawnieedreams's tweets](https://twitter.com/dawnieedreams). | Data | Quantity | | --- | --- | | Tweets downloaded | 3225 | | Retweets | 488 | | Short tweets | 441 | | Tweets kept | 2296 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/bnevhdny/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dawnieedreams's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vytig6y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vytig6y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dawnieedreams') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dawnieedreams
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
dawniee️ AI Bot @dawnieedreams bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dawnieedreams's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dawnieedreams's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
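The processed text above drops the fenced code block that the "You can use this model directly with a pipeline for text generation:" sentence introduces; the record's full `text` field preserves it. A minimal sketch of that usage snippet, assuming the standard `transformers` pipeline API:

```python
from transformers import pipeline

# Text-generation pipeline backed by the fine-tuned GPT-2 checkpoint from this record
generator = pipeline('text-generation', model='huggingtweets/dawnieedreams')

# Prompt the bot and sample five alternative completions
generator("My dream is", num_return_sequences=5)
```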
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1163922647/db002_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Devlet Bahçeli</div> <div style="text-align: center; font-size: 14px;">@dbdevletbahceli</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Devlet Bahçeli. | Data | Devlet Bahçeli | | --- | --- | | Tweets downloaded | 3200 | | Retweets | 0 | | Short tweets | 19 | | Tweets kept | 3181 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ni0ttu3d/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dbdevletbahceli's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ois198tw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ois198tw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dbdevletbahceli') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dbdevletbahceli/1625817202615/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dbdevletbahceli
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Devlet Bahçeli @dbdevletbahceli I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Devlet Bahçeli. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dbdevletbahceli's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
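As in the previous record, the usage snippet referenced by "You can use this model directly with a pipeline for text generation:" is stripped from this processed text but present in the record's full `text` field. A minimal sketch, assuming the `transformers` pipeline API:

```python
from transformers import pipeline

# Load the fine-tuned tweet generator for this record's model id
generator = pipeline('text-generation', model='huggingtweets/dbdevletbahceli')

# Generate several candidate tweets from the same prompt
generator("My dream is", num_return_sequences=5)
```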
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/711528302616432645/vRMgz3f8_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dale Dorsey 🤖 AI Bot </div> <div style="font-size: 15px">@dd0031 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dd0031's tweets](https://twitter.com/dd0031). | Data | Quantity | | --- | --- | | Tweets downloaded | 177 | | Retweets | 65 | | Short tweets | 18 | | Tweets kept | 94 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hyz1y2u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dd0031's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/82633lpl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/82633lpl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dd0031') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dd0031/1616729814364/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dd0031
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Dale Dorsey AI Bot @dd0031 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dd0031's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dd0031's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
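Again, the stripped usage snippet for this record, recovered from its full `text` field; a minimal sketch assuming the `transformers` pipeline API:

```python
from transformers import pipeline

# Pipeline over the GPT-2 checkpoint fine-tuned on @dd0031's tweets
generator = pipeline('text-generation', model='huggingtweets/dd0031')

# Sample five completions for the prompt
generator("My dream is", num_return_sequences=5)
```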
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1166296863068360704/9Rbf-i7O_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ddlc quote bot 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@ddlcquotes bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ddlcquotes's tweets](https://twitter.com/ddlcquotes). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3203</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>27</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3176</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3vugceit/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ddlcquotes's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1rh6mzov) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1rh6mzov/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ddlcquotes'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
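The usage snippet in the card above stops at `num_return_sequences`. Below is a short, hedged sketch of how the same pipeline call might be tuned for tweet-length output. The sampling arguments (`max_length`, `do_sample`, `top_p`, `temperature`) are standard `generate()` keyword arguments forwarded by the transformers text-generation pipeline, not settings documented by the card itself, so treat the specific values as illustrative assumptions.

```python
from transformers import pipeline

# Load the fine-tuned tweet model exactly as the card describes.
generator = pipeline('text-generation', model='huggingtweets/ddlcquotes')

# Illustrative sampling settings (assumed, not prescribed by the card):
# nucleus sampling with a modest temperature keeps generations short and tweet-like.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    max_length=60,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
for out in outputs:
    print(out['generated_text'])
```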
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ddlcquotes/1612815814568/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ddlcquotes
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">ddlc quote bot AI Bot </div> <div style="font-size: 15px; color: #657786">@ddlcquotes bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @ddlcquotes's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3203</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>27</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3176</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @ddlcquotes's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ddlcquotes'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ddlcquotes's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3203</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>27</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3176</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ddlcquotes's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ddlcquotes'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ddlcquotes's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3203</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>27</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3176</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ddlcquotes's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ddlcquotes'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1349980097168740352/GSthZg8p_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">persona non greta</div> <div style="text-align: center; font-size: 14px;">@dead__bug</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from persona non greta. | Data | persona non greta | | --- | --- | | Tweets downloaded | 3095 | | Retweets | 449 | | Short tweets | 623 | | Tweets kept | 2023 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/oyzjw1jc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dead__bug's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/20dghuyx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/20dghuyx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dead__bug') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dead__bug/1627231954071/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dead__bug
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT persona non greta @dead\_\_bug I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from persona non greta. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dead\_\_bug's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1145694071270326273/GPZwtxlf_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">DEA HQ 🤖 AI Bot </div> <div style="font-size: 15px">@deahq bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deahq's tweets](https://twitter.com/deahq). | Data | Quantity | | --- | --- | | Tweets downloaded | 3226 | | Retweets | 1568 | | Short tweets | 25 | | Tweets kept | 1633 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1th11ksq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deahq's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vxabqh4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vxabqh4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deahq') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deahq
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
DEA HQ AI Bot @deahq bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deahq's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deahq's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
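The "How to use" step above refers to a pipeline call; a minimal sketch of that call, using the model id and example prompt given elsewhere in this record:

```python
from transformers import pipeline

# Model id and example prompt are the ones given in this record's card.
generator = pipeline('text-generation', model='huggingtweets/deahq')
generator("My dream is", num_return_sequences=5)
```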
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1172237959032201219/8tZPfA9n_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">💟 Dealing Porn 💟 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@dealingporn bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dealingporn's tweets](https://twitter.com/dealingporn). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>2141</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1072</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/10cc3bw9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dealingporn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2d2vjn6v) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2d2vjn6v/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dealingporn'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
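The usage snippet embedded in the HTML of the card above renders to the following plain Python; the model id, prompt, and `num_return_sequences` value are exactly those shown in the card:

```python
from transformers import pipeline

# Plain equivalent of the HTML-formatted snippet in the card above.
generator = pipeline('text-generation', model='huggingtweets/dealingporn')
generator("My dream is", num_return_sequences=5)
```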
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dealingporn
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800"> Dealing Porn AI Bot </div> <div style="font-size: 15px; color: #657786">@dealingporn bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @dealingporn's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>2141</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1072</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @dealingporn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dealingporn'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dealingporn's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>2141</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1072</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dealingporn's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dealingporn'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dealingporn's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>2141</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1072</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dealingporn's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dealingporn'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1295317265257238529/8q3IptgS_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Death Battle Bot</div> <div style="text-align: center; font-size: 14px;">@deathbattlebot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Death Battle Bot. | Data | Death Battle Bot | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 1 | | Short tweets | 20 | | Tweets kept | 3229 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hcf8oqg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deathbattlebot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2d0vuhj5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2d0vuhj5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deathbattlebot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deathbattlebot/1626676974616/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deathbattlebot
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Death Battle Bot @deathbattlebot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Death Battle Bot. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deathbattlebot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
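As in the markdown card for this record, the "How to use" step comes down to a single pipeline call; a minimal sketch with the checkpoint id and prompt taken from the card:

```python
from transformers import pipeline

# Checkpoint id and prompt as given in this record's card.
generator = pipeline('text-generation', model='huggingtweets/deathbattlebot')
for out in generator("My dream is", num_return_sequences=5):
    print(out['generated_text'])
```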
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342266658316906496/UU6n9Qc-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">numen 🤖 AI Bot </div> <div style="font-size: 15px">@decadantism bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@decadantism's tweets](https://twitter.com/decadantism). | Data | Quantity | | --- | --- | | Tweets downloaded | 1922 | | Retweets | 50 | | Short tweets | 142 | | Tweets kept | 1730 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/28bmv6dp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @decadantism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1657a5q3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1657a5q3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/decadantism') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
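The card above shows the basic `pipeline` call. As an illustrative extension (not part of the original card), the sketch below passes common sampling parameters through the same pipeline; the model id comes from the card, while the decoding settings are assumptions chosen for demonstration.

```python
# Hedged sketch: the same text-generation pipeline as in the card, with explicit
# sampling parameters. The parameter values are illustrative assumptions.
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/decadantism')

outputs = generator(
    "My dream is",
    num_return_sequences=3,   # how many completions to return
    max_length=50,            # cap the total token length of each completion
    do_sample=True,           # sample instead of greedy decoding
    temperature=0.9,          # soften the next-token distribution
    top_p=0.95,               # nucleus sampling cutoff
)

for i, out in enumerate(outputs):
    print(f"{i}: {out['generated_text']}")
```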
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/decadantism/1617759987918/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/decadantism
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
numen AI Bot @decadantism bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @decadantism's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @decadantism's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1219604909382754304/dP1klRbB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">decodem.ai 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@decodemai bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@decodemai's tweets](https://twitter.com/decodemai). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>97</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>3</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>17</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>77</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/jjvvorob/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @decodemai's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ilv1sdu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ilv1sdu/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/decodemai'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
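The card above uses the `pipeline` helper. As a hedged alternative sketch (not part of the original card), the fine-tuned GPT-2 weights can also be loaded directly with the Auto classes; the model id comes from the card, and the decoding settings below are illustrative assumptions.

```python
# Hedged sketch: loading the fine-tuned GPT-2 directly instead of via pipeline().
# The generation settings are illustrative assumptions, not values from the card.
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/decodemai")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/decodemai")

inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=40,                        # total length including the prompt
    do_sample=True,                       # sample rather than greedy decode
    top_k=50,                             # restrict sampling to the 50 most likely tokens
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token, so reuse EOS
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```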
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/decodemai/1609942356404/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/decodemai
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">URL AI Bot </div> <div style="font-size: 15px; color: #657786">@decodemai bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @decodemai's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>97</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>3</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>17</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>77</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @decodemai's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/decodemai'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @decodemai's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>97</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>3</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>17</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>77</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @decodemai's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/decodemai'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @decodemai's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>97</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>3</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>17</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>77</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @decodemai's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/decodemai'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 427, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379273929961861122/3GjrPmt5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">rei ayanami appreciator 🤖 AI Bot </div> <div style="font-size: 15px">@decoratedboar bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@decoratedboar's tweets](https://twitter.com/decoratedboar). | Data | Quantity | | --- | --- | | Tweets downloaded | 3226 | | Retweets | 1039 | | Short tweets | 408 | | Tweets kept | 1779 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jw8lo8p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @decoratedboar's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2opldfex) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2opldfex/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/decoratedboar') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/decoratedboar/1617764745447/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/decoratedboar
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
rei ayanami appreciator AI Bot @decoratedboar bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @decoratedboar's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @decoratedboar's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
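The processed card above keeps the sentence "You can use this model directly with a pipeline for text generation:" but drops the fenced snippet that the full text field earlier in this record contains. A minimal sketch of that call, mirroring the card's own example (model id `huggingtweets/decoratedboar` taken from the card), is:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint named in the card.
generator = pipeline('text-generation', model='huggingtweets/decoratedboar')

# Generate five continuations of the example prompt, as in the card.
generator("My dream is", num_return_sequences=5)
```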
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1460219777574907905/vsqt8Fcx_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">sadie 🦝</div> <div style="text-align: center; font-size: 14px;">@deddogoon</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from sadie 🦝. | Data | sadie 🦝 | | --- | --- | | Tweets downloaded | 2566 | | Retweets | 461 | | Short tweets | 788 | | Tweets kept | 1317 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2xu99dv1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deddogoon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2i09ou5n) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2i09ou5n/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deddogoon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/deddogoon/1651202174338/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deddogoon
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT sadie @deddogoon I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from sadie . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deddogoon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
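As above, the processed card mentions pipeline-based generation but the snippet itself was stripped. Besides the card's one-liner, the same checkpoint can be driven through the lower-level `AutoTokenizer`/`AutoModelForCausalLM` API; the sketch below assumes only that the model behaves like any GPT-2 checkpoint on the Hub (which its tags indicate), and the sampling settings are illustrative rather than taken from the card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Explicit load, equivalent to pipeline('text-generation', model='huggingtweets/deddogoon').
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/deddogoon")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/deddogoon")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                        # sample instead of greedy decoding
    max_length=50,                         # illustrative cap, not from the card
    num_return_sequences=5,                # matches the card's example
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no pad token of its own
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```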
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365551575599562752/z281o-qD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">👁️ Deep Thrill 😎 🤖 AI Bot </div> <div style="font-size: 15px">@deeperthrill bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deeperthrill's tweets](https://twitter.com/deeperthrill). | Data | Quantity | | --- | --- | | Tweets downloaded | 3235 | | Retweets | 2415 | | Short tweets | 165 | | Tweets kept | 655 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3t139cp8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deeperthrill's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/9rc0g39n) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/9rc0g39n/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deeperthrill') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deeperthrill/1616001334930/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deeperthrill
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
️ Deep Thrill AI Bot @deeperthrill bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deeperthrill's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deeperthrill's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
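Here too the usage snippet was stripped from the processed card; the sketch below repeats the card's pipeline call and makes a few common sampling knobs explicit. The `top_k`, `top_p`, and `max_length` values are illustrative defaults, not settings recorded anywhere in this record.

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/deeperthrill')

# The card's example prompt with explicit sampling parameters.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    do_sample=True,
    max_length=50,   # illustrative
    top_k=50,        # illustrative
    top_p=0.95,      # illustrative
)
for out in outputs:
    print(out["generated_text"])
```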
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360640553373716482/Ai7f4fzH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">invisible college dropout 🤖 AI Bot </div> <div style="font-size: 15px">@deepfates bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deepfates's tweets](https://twitter.com/deepfates). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 230 | | Short tweets | 647 | | Tweets kept | 2369 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3jkblzyg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepfates's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3lia0i7h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3lia0i7h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepfates') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepfates/1617047160223/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepfates
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
invisible college dropout AI Bot @deepfates bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deepfates's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepfates's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1383905819217911808/AIWNRt5y_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1421970043269693450/kDxxMQub_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & dodo82.jp & TSM FTX Leffen</div> <div style="text-align: center; font-size: 14px;">@deepleffen-dodo82j-tsm_leffen</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & dodo82.jp & TSM FTX Leffen. | Data | Deep Leffen Bot | dodo82.jp | TSM FTX Leffen | | --- | --- | --- | --- | | Tweets downloaded | 505 | 220 | 3249 | | Retweets | 13 | 32 | 368 | | Short tweets | 26 | 26 | 142 | | Tweets kept | 466 | 162 | 2739 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/102ri7zl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-dodo82j-tsm_leffen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2m0x83ro) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2m0x83ro/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-dodo82j-tsm_leffen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
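The "Training procedure" section above states that the model is a pre-trained GPT-2 fine-tuned on the listed users' tweets, with hyperparameters tracked in W&B, but the card does not reproduce the training script. The snippet below is only a minimal sketch of that kind of causal-LM fine-tuning using the standard `Trainer` API; the file name `tweets.txt` and every hyperparameter are placeholder assumptions, not values from the actual huggingtweets run.

```python
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Start from the pre-trained GPT-2 checkpoint, as the card describes.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "tweets.txt" is a placeholder file: one cleaned tweet per line.
dataset = load_dataset("text", data_files={"train": "tweets.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True,
                                 remove_columns=["text"])

# mlm=False selects plain causal language modelling (next-token prediction).
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-tweets",
    num_train_epochs=4,               # illustrative; not the run's real value
    per_device_train_batch_size=8,
    learning_rate=5e-5,
    report_to="none",                 # the real run logged to W&B instead
)

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()
```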
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-dodo82j-tsm_leffen/1628743643357/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-dodo82j-tsm_leffen
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & URL & TSM FTX Leffen @deepleffen-dodo82j-tsm\_leffen I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & URL & TSM FTX Leffen. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-dodo82j-tsm\_leffen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1383905819217911808/AIWNRt5y_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & dodo82.jp</div> <div style="text-align: center; font-size: 14px;">@deepleffen-dodo82j</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & dodo82.jp. | Data | Deep Leffen Bot | dodo82.jp | | --- | --- | --- | | Tweets downloaded | 505 | 220 | | Retweets | 13 | 32 | | Short tweets | 26 | 26 | | Tweets kept | 466 | 162 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/q5s6fe5u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-dodo82j's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3gbchfdc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3gbchfdc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-dodo82j') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-dodo82j/1628739655504/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-dodo82j
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & URL @deepleffen-dodo82j I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & URL. Data: Tweets downloaded, Deep Leffen Bot: 505, URL: 220 Data: Retweets, Deep Leffen Bot: 13, URL: 32 Data: Short tweets, Deep Leffen Bot: 26, URL: 26 Data: Tweets kept, Deep Leffen Bot: 466, URL: 162 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-dodo82j's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
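The plain-text rendering above drops the fenced code block that the original card shows under "How to use"; for reference, this is the same snippet as in the card's markdown, with brief comments added:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint published under this record's model id.
generator = pipeline('text-generation', model='huggingtweets/deepleffen-dodo82j')

# Sample five continuations of the same prompt.
generator("My dream is", num_return_sequences=5)
```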
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1425624685190946817/QM0oy_7p_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">wint & Deep Leffen Bot & LUMBAGO (1984)</div> <div style="text-align: center; font-size: 14px;">@deepleffen-dril-twomad</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from wint & Deep Leffen Bot & LUMBAGO (1984). | Data | wint | Deep Leffen Bot | LUMBAGO (1984) | | --- | --- | --- | --- | | Tweets downloaded | 3203 | 505 | 3249 | | Retweets | 459 | 13 | 61 | | Short tweets | 311 | 26 | 1693 | | Tweets kept | 2433 | 466 | 1495 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1kvrfp57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-dril-twomad's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6koeu45s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6koeu45s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-dril-twomad') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-dril-twomad/1628758133714/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-dril-twomad
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG wint & Deep Leffen Bot & LUMBAGO (1984) @deepleffen-dril-twomad I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from wint & Deep Leffen Bot & LUMBAGO (1984). Data: Tweets downloaded, wint: 3203, Deep Leffen Bot: 505, LUMBAGO (1984): 3249 Data: Retweets, wint: 459, Deep Leffen Bot: 13, LUMBAGO (1984): 61 Data: Short tweets, wint: 311, Deep Leffen Bot: 26, LUMBAGO (1984): 1693 Data: Tweets kept, wint: 2433, Deep Leffen Bot: 466, LUMBAGO (1984): 1495 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-dril-twomad's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
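The "Training procedure" paragraph only states that a pre-trained GPT-2 was fine-tuned on the collected tweets with hyperparameters logged to W&B; the actual huggingtweets training script is not included in the card. A minimal sketch of such a fine-tune with the Hugging Face Trainer API is shown below — the tweet file, block size, and hyperparameters are illustrative assumptions, not the values used for this model:

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

# Start from the pre-trained GPT-2 checkpoint, as the card describes.
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Hypothetical file with one kept tweet per line; the real pipeline downloads
# tweets from the Twitter API and filters out retweets and short tweets first.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt",
                            block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

# Illustrative settings; report_to="wandb" mirrors the W&B tracking mentioned
# in the card, but the run's actual hyperparameters live only in the W&B logs.
args = TrainingArguments(output_dir="output", num_train_epochs=4,
                         per_device_train_batch_size=1, logging_steps=5,
                         report_to="wandb")

Trainer(model=model, args=args, data_collator=collator,
        train_dataset=train_dataset).train()
```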
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & wint</div> <div style="text-align: center; font-size: 14px;">@deepleffen-dril</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & wint. | Data | Deep Leffen Bot | wint | | --- | --- | --- | | Tweets downloaded | 506 | 3209 | | Retweets | 13 | 463 | | Short tweets | 26 | 311 | | Tweets kept | 467 | 2435 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/29zfoi4y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-dril's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2fygim56) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2fygim56/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-dril') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-dril/1628834161509/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-dril
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & wint @deepleffen-dril I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & wint. Data: Tweets downloaded, Deep Leffen Bot: 506, wint: 3209 Data: Retweets, Deep Leffen Bot: 13, wint: 463 Data: Short tweets, Deep Leffen Bot: 26, wint: 311 Data: Tweets kept, Deep Leffen Bot: 467, wint: 2435 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-dril's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
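As a complement to the pipeline call shown in the card, the same checkpoint can be driven through the tokenizer and model classes directly, which exposes the sampling parameters; the temperature, top_p, and length below are arbitrary examples rather than settings recommended by the card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/deepleffen-dril")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/deepleffen-dril")

# Encode the prompt and sample one continuation; GPT-2 has no pad token,
# so the end-of-text token is reused to avoid the generation warning.
inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(**inputs, do_sample=True, temperature=0.9,
                            top_p=0.95, max_length=60,
                            pad_token_id=tokenizer.eos_token_id)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```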
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1386749605216407555/QIJeyWfE_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1425624685190946817/QM0oy_7p_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & wint but Al & LUMBAGO (1984)</div> <div style="text-align: center; font-size: 14px;">@deepleffen-dril_gpt2-twomad</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & wint but Al & LUMBAGO (1984). | Data | Deep Leffen Bot | wint but Al | LUMBAGO (1984) | | --- | --- | --- | --- | | Tweets downloaded | 505 | 3248 | 3249 | | Retweets | 13 | 41 | 61 | | Short tweets | 26 | 49 | 1691 | | Tweets kept | 466 | 3158 | 1497 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/363d721t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-dril_gpt2-twomad's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3gtnp6sz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3gtnp6sz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-dril_gpt2-twomad') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
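As a complement to the pipeline snippet above, here is a minimal sketch of the same generation done with the lower-level `AutoTokenizer`/`AutoModelForCausalLM` API. It assumes the `huggingtweets/deepleffen-dril_gpt2-twomad` checkpoint resolves on the Hub as stated in the card; the sampling values (`top_p`, `temperature`, `max_length`) are illustrative choices, not settings recommended by the card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned GPT-2 checkpoint and its tokenizer from the Hub.
model_id = "huggingtweets/deepleffen-dril_gpt2-twomad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Encode the prompt and sample a few continuations.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                        # sample instead of greedy decoding
    max_length=60,                         # total tokens, prompt included
    top_p=0.95,                            # nucleus sampling
    temperature=0.9,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no dedicated pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```

This route is useful when you want direct control over `generate()` arguments or need the raw token ids rather than the pipeline's post-processed strings.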
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-dril_gpt2-twomad/1628757298537/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-dril_gpt2-twomad
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & wint but Al & LUMBAGO (1984) @deepleffen-dril\_gpt2-twomad I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & wint but Al & LUMBAGO (1984). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-dril\_gpt2-twomad's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1419130410659889153/F2F8J5kC_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & ابن ڪربلاء 🇮🇶🇵🇸</div> <div style="text-align: center; font-size: 14px;">@deepleffen-ibnalrafidayn</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & ابن ڪربلاء 🇮🇶🇵🇸. | Data | Deep Leffen Bot | ابن ڪربلاء 🇮🇶🇵🇸 | | --- | --- | --- | | Tweets downloaded | 497 | 3157 | | Retweets | 13 | 1624 | | Short tweets | 26 | 149 | | Tweets kept | 458 | 1384 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/nxq13p47/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-ibnalrafidayn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/40mnb9ye) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/40mnb9ye/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-ibnalrafidayn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
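The pipeline call shown in the card forwards extra keyword arguments to `model.generate()`, so sampling behaviour can be adjusted without leaving the high-level API. The sketch below assumes the `huggingtweets/deepleffen-ibnalrafidayn` checkpoint loads as described; the specific `top_k`, `temperature`, and `max_length` values are illustrative assumptions rather than tuned settings for this model.

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/deepleffen-ibnalrafidayn')

# Generation keyword arguments are passed through to model.generate();
# the values below are illustrative, not tuned for this checkpoint.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    max_length=50,
    do_sample=True,
    top_k=50,
    temperature=0.8,
)
for out in outputs:
    print(out["generated_text"])
```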
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-ibnalrafidayn/1627628633670/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-ibnalrafidayn
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & ابن ڪربلاء 🇮🇶🇵🇸 @deepleffen-ibnalrafidayn I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & ابن ڪربلاء 🇮🇶🇵🇸. Data: Tweets downloaded, Deep Leffen Bot: 497, ابن ڪربلاء 🇮🇶🇵🇸: 3157 Data: Retweets, Deep Leffen Bot: 13, ابن ڪربلاء 🇮🇶🇵🇸: 1624 Data: Short tweets, Deep Leffen Bot: 26, ابن ڪربلاء 🇮🇶🇵🇸: 149 Data: Tweets kept, Deep Leffen Bot: 458, ابن ڪربلاء 🇮🇶🇵🇸: 1384 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-ibnalrafidayn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1425624685190946817/QM0oy_7p_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1104281298967904257/KuDWZQfF_400x400.png&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot & LUMBAGO (1984) & Schlatt</div> <div style="text-align: center; font-size: 14px;">@deepleffen-jschlatt-twomad</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot & LUMBAGO (1984) & Schlatt. | Data | Deep Leffen Bot | LUMBAGO (1984) | Schlatt | | --- | --- | --- | --- | | Tweets downloaded | 505 | 3249 | 3250 | | Retweets | 13 | 62 | 3 | | Short tweets | 26 | 1691 | 1236 | | Tweets kept | 466 | 1496 | 2011 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/tchb83i1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen-jschlatt-twomad's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/35gw3gup) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/35gw3gup/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen-jschlatt-twomad') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
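Because the card's pipeline example samples stochastically, repeated runs produce different tweets. A minimal sketch of making the output repeatable with `transformers.set_seed` follows; it assumes the `huggingtweets/deepleffen-jschlatt-twomad` checkpoint loads as described, and the seed and generation lengths are arbitrary illustrative choices.

```python
from transformers import pipeline, set_seed

# Fixing the seed makes the sampled continuations repeatable across runs.
set_seed(42)

generator = pipeline('text-generation', model='huggingtweets/deepleffen-jschlatt-twomad')
results = generator("My dream is", num_return_sequences=5, max_length=40, do_sample=True)
for r in results:
    print(r["generated_text"])
```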
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deepleffen-jschlatt-twomad/1628748624093/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen-jschlatt-twomad
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Deep Leffen Bot & LUMBAGO (1984) & Schlatt @deepleffen-jschlatt-twomad I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot & LUMBAGO (1984) & Schlatt. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen-jschlatt-twomad's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
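The plain-text rendering above drops the code snippet that follows "You can use this model directly with a pipeline for text generation:". For reference, this is the example from the corresponding model card:

```python
from transformers import pipeline

# Text-generation pipeline backed by the fine-tuned GPT-2 hosted on the Hub
generator = pipeline('text-generation', model='huggingtweets/deepleffen-jschlatt-twomad')
generator("My dream is", num_return_sequences=5)
```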
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1241879678455078914/e2EdZIrr_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Deep Leffen Bot</div> <div style="text-align: center; font-size: 14px;">@deepleffen</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Deep Leffen Bot. | Data | Deep Leffen Bot | | --- | --- | | Tweets downloaded | 589 | | Retweets | 14 | | Short tweets | 27 | | Tweets kept | 548 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1p32tock/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deepleffen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/imjjixah) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/imjjixah/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deepleffen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/deepleffen/1654277690184/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deepleffen
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Deep Leffen Bot @deepleffen I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deep Leffen Bot. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deepleffen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
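As above, the plain-text version omits the usage snippet mentioned under "How to use". The example from the corresponding model card is:

```python
from transformers import pipeline

# Load the fine-tuned model directly from the Hub and generate continuations
generator = pipeline('text-generation', model='huggingtweets/deepleffen')
generator("My dream is", num_return_sequences=5)
```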
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364302532080762882/8_tNRrto_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ryan 🤖 AI Bot </div> <div style="font-size: 15px">@defnotreal_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@defnotreal_'s tweets](https://twitter.com/defnotreal_). | Data | Quantity | | --- | --- | | Tweets downloaded | 2920 | | Retweets | 2437 | | Short tweets | 113 | | Tweets kept | 370 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3lvnbvmn/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @defnotreal_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3uffbfpz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3uffbfpz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/defnotreal_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/defnotreal_/1616212090089/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/defnotreal_
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ryan AI Bot @defnotreal_ bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @defnotreal_'s tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @defnotreal_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
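Here too, the flattened text omits the snippet referenced by "You can use this model directly with a pipeline for text generation:". The example from the corresponding model card is:

```python
from transformers import pipeline

# Standard text-generation pipeline for the fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/defnotreal_')
generator("My dream is", num_return_sequences=5)
```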
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/58546628/goat22_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/726824334002638848/BEZFr1k8_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">wint & deg & Fred Delicious</div> <div style="text-align: center; font-size: 14px;">@degg-dril-fred_delicious</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from wint & deg & Fred Delicious. | Data | wint | deg | Fred Delicious | | --- | --- | --- | --- | | Tweets downloaded | 3227 | 3152 | 3235 | | Retweets | 473 | 142 | 429 | | Short tweets | 318 | 42 | 398 | | Tweets kept | 2436 | 2968 | 2408 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1mwoed1f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @degg-dril-fred_delicious's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1a691ucn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1a691ucn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/degg-dril-fred_delicious') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/degg-dril-fred_delicious/1634845142916/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/degg-dril-fred_delicious
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
AI CYBORG wint & deg & Fred Delicious @degg-dril-fred\_delicious I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from wint & deg & Fred Delicious. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @degg-dril-fred\_delicious's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ 58 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ 0.01490766927599907, -0.029707439243793488, -0.005551214329898357, 0.019200731068849564, 0.13538123667240143, 0.031960584223270416, 0.07854003459215164, 0.15176640450954437, -0.03478027135133743, 0.020825274288654327, 0.17780765891075134, 0.13369141519069672, -0.025615064427256584, 0.10110418498516083, -0.04495277628302574, -0.27562591433525085, 0.05076512694358826, 0.04804648086428642, -0.03660755604505539, 0.13914404809474945, 0.08130049705505371, -0.04002818092703819, 0.09946437180042267, -0.02348330244421959, -0.18448196351528168, 0.03603345900774002, 0.03874115273356438, -0.10113959014415741, 0.1245182529091835, 0.042210761457681656, 0.097016341984272, 0.02144498936831951, -0.08173582702875137, -0.09949471801519394, 0.04076211526989937, 0.0524214468896389, -0.06747427582740784, 0.07371947914361954, 0.06548287719488144, -0.09104428440332413, 0.15018822252750397, 0.032346583902835846, -0.015613866969943047, 0.05642572045326233, -0.17452795803546906, -0.05158021301031113, -0.038718461990356445, 0.007089211139827967, 0.026932736858725548, 0.0765126645565033, -0.03151388466358185, 0.1801837533712387, -0.09969375282526016, 0.08080597966909409, 0.19449612498283386, -0.3136744201183319, -0.008555608801543713, 0.09032562375068665, 0.10929754376411438, 0.04874800145626068, -0.027296993881464005, 0.0915885642170906, 0.06193246319890022, 0.01960965059697628, 0.03833269700407982, -0.04861060902476311, -0.10319411009550095, 0.05647158622741699, -0.09057577699422836, -0.06054073944687843, 0.21635963022708893, -0.04008340463042259, 0.06982671469449997, -0.06175680458545685, -0.10815948247909546, -0.04076464846730232, -0.0019450898980721831, 0.007722846698015928, -0.04665110260248184, 0.07648039609193802, -0.018055662512779236, -0.07031384110450745, -0.16104750335216522, 0.004024024587124586, -0.19634990394115448, 0.15040342509746552, -0.007091521751135588, 0.03826533630490303, -0.18292304873466492, 0.11239926517009735, -0.00463539082556963, -0.09352986514568329, 0.05768691375851631, -0.09079932421445847, 0.06515228003263474, 0.012864203192293644, -0.07714717835187912, -0.01476553175598383, 0.07529455423355103, 0.14459288120269775, -0.03202451393008232, -0.014661530964076519, 0.005731707438826561, 0.08028950542211533, 0.0649947077035904, 0.0529506653547287, -0.05044480413198471, -0.04604479670524597, 0.023471293970942497, -0.12279047816991806, 0.005323461256921291, -0.08722265064716339, -0.11850544810295105, -0.06324991583824158, 0.02913680486381054, 0.0409526452422142, 0.05413410812616348, 0.11646701395511627, -0.03819722682237625, 0.003125704126432538, 0.0790749341249466, -0.04754907265305519, 0.014819944277405739, -0.019563961774110794, 0.019939353689551353, 0.10169259458780289, -0.004264532588422298, 0.02739645354449749, -0.08098349720239639, 0.053036682307720184, -0.10909318923950195, -0.01619906909763813, -0.01401914469897747, -0.08360559493303299, 0.03495157137513161, -0.13044656813144684, 0.003459199797362089, -0.1794617623090744, -0.08515556156635284, 0.004473025444895029, -0.02669714391231537, -0.029448989778757095, -0.07044726610183716, -0.000012087725735909771, -0.03782980516552925, 0.09037511795759201, -0.049331095069646835, 0.02187872678041458, -0.05999310687184334, 0.10105309635400772, -0.06390658020973206, 0.10026112198829651, -0.14347022771835327, 0.062101077288389206, -0.13453714549541473, -0.007185438182204962, -0.056844428181648254, 0.06125464290380478, 0.003428247757256031, 0.14510339498519897, -0.0019743351731449366, -0.024058373644948006, -0.11893454194068909, 
0.07660667598247528, -0.013276024721562862, 0.20594745874404907, -0.08867699652910233, -0.12363981455564499, 0.18977078795433044, -0.05639764666557312, -0.13151882588863373, 0.12418028712272644, 0.014429234899580479, 0.0632154569029808, 0.06213442608714104, 0.22679246962070465, 0.019381923601031303, -0.012931653298437595, 0.022296493873000145, 0.09503678232431412, -0.14555561542510986, -0.009661003015935421, 0.00841028243303299, -0.0034040703903883696, -0.08606491982936859, 0.03532338887453079, 0.11260384321212769, 0.09308808296918869, -0.06410733610391617, -0.02130906470119953, -0.04973118007183075, -0.002825164934620261, 0.08984078466892242, -0.004783526994287968, 0.10731708258390427, -0.1087048277258873, -0.0696249008178711, -0.030250849202275276, -0.0012943379115313292, 0.019239740446209908, 0.047808386385440826, -0.02344650961458683, 0.12562713027000427, -0.007734235376119614, 0.03894360736012459, -0.1425095796585083, -0.08136747777462006, -0.0352344736456871, 0.1538124978542328, 0.043824564665555954, 0.12064811587333679, 0.05824866518378258, -0.05751828849315643, -0.01182011142373085, -0.0009749550954438746, 0.1436033844947815, -0.01271585002541542, -0.0852496474981308, -0.05857346951961517, 0.08234002441167831, -0.07005385309457779, 0.025747623294591904, -0.044775284826755524, 0.034807976335287094, 0.07559926062822342, 0.12458331137895584, -0.012196341529488564, 0.03711283579468727, -0.0114806042984128, -0.00011358146002748981, -0.0896436795592308, -0.017758425325155258, 0.08426769077777863, -0.01031328085809946, -0.06891145557165146, 0.2548580765724182, -0.20196770131587982, 0.2299574315547943, 0.2296006977558136, -0.24926680326461792, -0.031021486967802048, -0.033769410103559494, -0.053823258727788925, 0.017695831134915352, 0.037970591336488724, -0.06758655607700348, 0.04836089164018631, -0.04798879474401474, 0.14743368327617645, -0.039952509105205536, -0.067154660820961, 0.011684614233672619, -0.06448059529066086, -0.05935867503285408, 0.06313008069992065, 0.042497556656599045, -0.12196247279644012, 0.19907300174236298, 0.256646990776062, 0.05731065943837166, 0.19537882506847382, 0.018505526706576347, -0.008647909387946129, 0.05532584711909294, -0.06024787575006485, -0.05939318984746933, -0.047756556421518326, -0.17080695927143097, -0.04192587733268738, 0.08264796435832977, 0.03818074241280556, 0.10142038762569427, -0.10767655074596405, -0.07215311378240585, 0.009001186117529869, 0.015298610553145409, -0.0071457610465586185, 0.13598360121250153, 0.0518062487244606, 0.13914446532726288, -0.005930832587182522, 0.025881925597786903, 0.06802397966384888, 0.02908273972570896, -0.08634071797132492, 0.1456945538520813, -0.1443340927362442, -0.3761582672595978, -0.1337488889694214, -0.09692523628473282, -0.01728254184126854, 0.04180695861577988, 0.12187701463699341, -0.12770286202430725, 0.0015871572541072965, -0.038519375026226044, 0.11093265563249588, -0.08619660884141922, 0.05064225569367409, -0.09312444925308228, 0.013121986761689186, -0.05371590703725815, -0.09208092838525772, -0.03968694806098938, -0.025510141626000404, -0.09566790610551834, 0.17000380158424377, -0.06683114916086197, 0.06858081370592117, 0.18506819009780884, 0.00719145592302084, 0.0289844311773777, -0.048802170902490616, 0.19252337515354156, -0.11087619513273239, 0.03966445475816727, 0.16040337085723877, 0.017642250284552574, 0.0870000347495079, 0.1063152626156807, -0.012242939323186874, -0.0729413852095604, 0.045604314655065536, 0.00838512647897005, -0.1192161962389946, -0.1628645360469818, -0.12800830602645874, 
-0.09412717819213867, 0.11832976341247559, 0.04896165430545807, 0.06800009310245514, 0.15884871780872345, 0.07476164400577545, -0.026399539783596992, -0.014989175833761692, -0.030628371983766556, 0.06698833405971527, 0.1810096949338913, -0.03410731256008148, 0.14575767517089844, -0.058808114379644394, -0.11332095414400101, 0.13757702708244324, 0.02930915355682373, 0.041703663766384125, 0.02182352915406227, -0.00909164547920227, 0.002967006294056773, 0.1275966912508011, 0.12905162572860718, 0.09085892140865326, -0.009210709482431412, -0.02600487507879734, -0.043374914675951004, -0.013961076736450195, -0.004321942571550608, 0.0544881634414196, 0.04976500943303108, -0.17056837677955627, -0.05252770707011223, -0.15036118030548096, 0.10635104030370712, 0.0658026859164238, 0.1097705140709877, -0.1943172961473465, -0.002813685918226838, 0.07878942042589188, -0.032336506992578506, -0.10688291490077972, 0.0711626410484314, 0.08084181696176529, -0.10172833502292633, 0.061861153692007065, -0.006532561499625444, 0.10406243056058884, 0.023742493242025375, 0.09878388792276382, -0.05866028368473053, -0.06148262321949005, -0.016518592834472656, 0.0939461812376976, -0.2926773428916931, 0.18179792165756226, -0.02907240390777588, -0.09844435751438141, -0.07134795933961868, -0.027840720489621162, 0.02770385891199112, 0.08060616999864578, 0.07026281952857971, 0.03338824585080147, 0.004331306088715792, -0.096750907599926, -0.025082198902964592, 0.01906721480190754, 0.12366755306720734, -0.05625581368803978, -0.01350511983036995, -0.041342347860336304, 0.02173074521124363, 0.00881099235266447, 0.06707418709993362, 0.021282022818922997, -0.16175130009651184, 0.07081238180398941, 0.03874411806464195, 0.03632965683937073, 0.02809896133840084, -0.02992626093327999, -0.1548270583152771, 0.1699383705854416, 0.02149343490600586, -0.06355534493923187, -0.12458667904138565, -0.04415702819824219, 0.06474760919809341, -0.04559854418039322, 0.039427343755960464, -0.0629853680729866, -0.001072446582838893, -0.07396374642848969, -0.19678819179534912, 0.1352648138999939, -0.06877507269382477, -0.07956159859895706, -0.03609750047326088, 0.17941854894161224, -0.07844837009906769, 0.018584884703159332, 0.007694936357438564, 0.047691553831100464, -0.14645716547966003, -0.11046357452869415, 0.06859224289655685, -0.04709484800696373, 0.036693572998046875, 0.01622764952480793, -0.02127574197947979, 0.005549534223973751, -0.012371892109513283, -0.0025305403396487236, 0.27376097440719604, 0.22965525090694427, -0.0802733525633812, 0.17710179090499878, 0.0630764365196228, -0.0507618673145771, -0.326980859041214, -0.09341348707675934, -0.14117297530174255, -0.009823985397815704, 0.023037875071167946, -0.13106577098369598, 0.05048951879143715, 0.029267780482769012, -0.014271927997469902, 0.11641538888216019, -0.20954635739326477, -0.09825599938631058, 0.09716890752315521, -0.06865759193897247, 0.4034286141395569, -0.1301247775554657, -0.07252515107393265, -0.03537401184439659, -0.14370368421077728, 0.1951674520969391, -0.06888531148433685, 0.09445398300886154, -0.02521556057035923, 0.13606955111026764, 0.05328543111681938, -0.02234726957976818, 0.11178411543369293, -0.01972258649766445, 0.0011309145484119654, -0.13865596055984497, -0.0604546032845974, 0.08001494407653809, -0.007717274595052004, 0.009300438687205315, -0.10272051393985748, 0.02493043802678585, -0.1687590479850769, -0.0011172585655003786, -0.11747178435325623, 0.09020119160413742, 0.021906770765781403, -0.06782494485378265, -0.050800591707229614, -0.0397600457072258, 
-0.009170935489237309, -0.01781470514833927, 0.14266571402549744, -0.05449723079800606, 0.1958875209093094, 0.07874593138694763, 0.06835102289915085, -0.1398908495903015, 0.05150968208909035, -0.027329890057444572, -0.0722222700715065, 0.07193705439567566, -0.1763947606086731, 0.04665200039744377, 0.08747349679470062, -0.03685600683093071, 0.04361190274357796, 0.09054525941610336, 0.0016616116045042872, 0.015259911306202412, 0.17542323470115662, -0.26249444484710693, -0.009265073575079441, -0.05851370841264725, -0.07754334062337875, 0.09399785101413727, 0.05022705718874931, 0.17302869260311127, 0.01883954182267189, -0.042071253061294556, 0.01570643112063408, 0.01509455218911171, -0.059037789702415466, 0.045513592660427094, 0.02684464305639267, 0.008570663630962372, -0.1221366673707962, 0.05156322568655014, 0.023951835930347443, -0.13511396944522858, 0.028608962893486023, 0.18224509060382843, -0.11025255918502808, -0.12385176122188568, -0.05918734520673752, 0.014604943804442883, -0.1147887259721756, 0.022507676854729652, -0.0024671610444784164, -0.09835006296634674, 0.062000639736652374, 0.13879112899303436, 0.06231173500418663, 0.12207856774330139, -0.016749046742916107, -0.009889025241136551, -0.02596008963882923, -0.03651546314358711, 0.030208537355065346, 0.03168511390686035, -0.0861181765794754, 0.11346250027418137, -0.02512839064002037, 0.1417820155620575, -0.10664749890565872, -0.05201685056090355, -0.14300444722175598, -0.032158151268959045, -0.08575893938541412, -0.11310356855392456, -0.07854323089122772, -0.07035478204488754, 0.009079275652766228, -0.06144321709871292, -0.04742341861128807, -0.07850945740938187, -0.1071249470114708, 0.0065527972765266895, -0.04037778079509735, 0.04593488946557045, -0.08211231976747513, -0.0025119397323578596, 0.11637649685144424, -0.022535819560289383, 0.16392095386981964, 0.11117493361234665, -0.09679865092039108, 0.07616551965475082, -0.12686336040496826, -0.10472826659679413, 0.09312791377305984, -0.011926734820008278, 0.03812364861369133, 0.11054535210132599, 0.005196572281420231, 0.026585882529616356, 0.05407439172267914, 0.06366553902626038, 0.03967318683862686, -0.10743927955627441, 0.0886315256357193, -0.020753242075443268, -0.1556011289358139, -0.045647621154785156, -0.0805148333311081, 0.03534896299242973, 0.01974404975771904, 0.10271792113780975, -0.034986238926649094, 0.0667109340429306, -0.08902619034051895, 0.024180470034480095, -0.000016180018064915203, -0.17815378308296204, -0.01795545034110546, -0.044853758066892624, 0.03495967015624046, 0.017773056402802467, 0.24157072603702545, 0.06085042282938957, -0.06760483235120773, 0.04657021537423134, 0.12547647953033447, 0.015325317159295082, 0.0012204478261992335, 0.15906299650669098, 0.09267544746398926, -0.07720284163951874, -0.1355191469192505, 0.06737678498029709, 0.019282789900898933, -0.045628830790519714, 0.09535399079322815, 0.009011821821331978, 0.021824736148118973, 0.0726994201540947, -0.01307250652462244, 0.0016560814110562205, -0.10145208984613419, -0.11841893196105957, -0.038609180599451065, 0.056700821965932846, -0.03823903948068619, 0.08270184695720673, 0.1624554693698883, -0.01529943011701107, 0.038163430988788605, -0.03936288133263588, -0.01488962396979332, -0.12554575502872467, -0.1426725685596466, -0.06450272351503372, -0.16302600502967834, -0.007763232104480267, -0.08333444595336914, 0.06433288753032684, 0.07949802279472351, 0.059826187789440155, -0.04269853234291077, 0.07651268690824509, 0.041022781282663345, -0.10034794360399246, 0.07510355859994888, 
-0.02059151791036129, 0.05323530361056328, -0.021385202184319496, -0.019656827673316002, -0.10213424265384674, 0.05417940020561218, -0.020723329856991768, 0.0416044183075428, -0.05111391469836235, 0.009476713836193085, -0.17171601951122284, -0.1145220696926117, -0.06269708275794983, 0.05981360003352165, -0.05057463422417641, 0.05945824086666107, 0.01168898493051529, 0.011022492311894894, 0.016294511035084724, 0.24577881395816803, -0.06521986424922943, -0.04237864539027214, -0.039796773344278336, 0.1401647925376892, -0.010960960760712624, 0.07910837978124619, -0.04971875622868538, -0.026874123141169548, -0.11801496148109436, 0.31808507442474365, 0.3330231308937073, -0.09461544454097748, 0.042824193835258484, -0.011730561032891273, 0.03255566582083702, 0.12393459677696228, 0.12263859808444977, 0.1047649085521698, 0.23149316012859344, -0.07330518960952759, -0.04662523791193962, -0.0235530287027359, -0.011487046256661415, -0.07061693072319031, 0.09144455194473267, 0.024042300879955292, -0.06604029983282089, -0.04112962633371353, 0.061292614787817, -0.1968756765127182, 0.09409318119287491, -0.08292321115732193, -0.2009436935186386, -0.043561115860939026, 0.03980926424264908, 0.11663878709077835, -0.0058410209603607655, 0.12327833473682404, 0.0021221975330263376, -0.06928802281618118, 0.0007765711052343249, 0.021609993651509285, -0.21508292853832245, 0.024919532239437103, 0.07233009487390518, -0.12181923538446426, -0.003256353549659252, -0.03614840656518936, -0.0016090048011392355, 0.08110456168651581, 0.041294027119874954, -0.03950197249650955, 0.026108399033546448, 0.004547862336039543, -0.03431681916117668, -0.023192770779132843, 0.04429425671696663, 0.04074658825993538, -0.15583069622516632, 0.0852603018283844, -0.15913386642932892, 0.045099467039108276, -0.033806633204221725, 0.0012128398520871997, -0.010232296772301197, -0.0017561162821948528, -0.040829166769981384, 0.08149882405996323, 0.07333235442638397, -0.022076698020100594, -0.0061705452390015125, -0.06384896486997604, -0.055786989629268646, -0.027778856456279755, -0.08865642547607422, -0.09847963601350784, -0.12417391687631607, -0.10101866722106934, 0.09827395528554916, -0.022276965901255608, -0.17602279782295227, 0.008451612666249275, -0.09465660154819489, 0.06178250536322594, -0.14477479457855225, 0.11634210497140884, 0.08996126055717468, 0.0033104352187365294, 0.006666821893304586, -0.021679235622286797, 0.07054676860570908, 0.1143367812037468, -0.09097250550985336, -0.06963858008384705 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1361151177455468548/mGKDi3dV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Degrassi No Context 🤖 AI Bot </div> <div style="font-size: 15px">@degrassinocontx bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@degrassinocontx's tweets](https://twitter.com/degrassinocontx). | Data | Quantity | | --- | --- | | Tweets downloaded | 3245 | | Retweets | 54 | | Short tweets | 1504 | | Tweets kept | 1687 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mu201mzi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @degrassinocontx's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1wxznhll) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1wxznhll/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/degrassinocontx') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/degrassinocontx/1614122429501/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/degrassinocontx
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Degrassi No Context AI Bot @degrassinocontx bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @degrassinocontx's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @degrassinocontx's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1389779771396132864/YfpCtmQo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Just the Tip 🤖 AI Bot </div> <div style="font-size: 15px">@deityofyoutube bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deityofyoutube's tweets](https://twitter.com/deityofyoutube). | Data | Quantity | | --- | --- | | Tweets downloaded | 1518 | | Retweets | 58 | | Short tweets | 47 | | Tweets kept | 1413 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o3puxa8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deityofyoutube's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ou2v7h4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ou2v7h4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deityofyoutube') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deityofyoutube/1620252403590/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deityofyoutube
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Just the Tip AI Bot @deityofyoutube bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deityofyoutube's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deityofyoutube's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360137906569027585/AMjiAhW6_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">evelyn 🤖 AI Bot </div> <div style="font-size: 15px">@deleteevelyn bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deleteevelyn's tweets](https://twitter.com/deleteevelyn). | Data | Quantity | | --- | --- | | Tweets downloaded | 3196 | | Retweets | 277 | | Short tweets | 358 | | Tweets kept | 2561 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1cgctgf3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deleteevelyn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/inf2farl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/inf2farl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deleteevelyn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deleteevelyn/1614106800270/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deleteevelyn
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
evelyn AI Bot @deleteevelyn bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deleteevelyn's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deleteevelyn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/2300677070/886b0055ea8f8e5ba55a58f8ea82dac8_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Delicious Tacos 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@delicious_tacos bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@delicious_tacos's tweets](https://twitter.com/delicious_tacos). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3214</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>817</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>578</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1819</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1573t9o6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @delicious_tacos's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/39svipdd) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/39svipdd/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/delicious_tacos'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
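The "Tweets kept" row in the table above is what remains after retweets and very short tweets are filtered out. The sketch below illustrates that kind of filtering; the exact rules huggingtweets applies (for example, what counts as a short tweet) are assumptions here, not taken from the project source.

```python
# Hypothetical filtering step mirroring the Retweets / Short tweets / Tweets kept split
def keep_tweet(text: str, min_words: int = 3) -> bool:
    if text.startswith("RT @"):           # drop retweets
        return False
    if len(text.split()) < min_words:     # drop very short tweets (assumed threshold)
        return False
    return True

tweets = [
    "RT @someone: great thread",
    "lol",
    "Working on a new post about tacos tonight.",
]
kept = [t for t in tweets if keep_tweet(t)]
print(f"Tweets kept: {len(kept)} of {len(tweets)}")   # Tweets kept: 1 of 3
```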
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/delicious_tacos
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Delicious Tacos AI Bot </div> <div style="font-size: 15px; color: #657786">@delicious_tacos bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @delicious_tacos's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3214</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>817</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>578</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1819</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @delicious_tacos's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/delicious_tacos'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @delicious_tacos's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3214</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>817</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>578</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1819</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @delicious_tacos's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/delicious_tacos'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @delicious_tacos's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3214</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>817</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>578</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1819</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @delicious_tacos's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/delicious_tacos'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1460542430357381120/3QwgzK9N_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">🍕 Deliveroo France</div> <div style="text-align: center; font-size: 14px;">@deliveroo_fr</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 🍕 Deliveroo France. | Data | 🍕 Deliveroo France | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 5 | | Short tweets | 209 | | Tweets kept | 3036 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/35lhnvsx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deliveroo_fr's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3544md47) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3544md47/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deliveroo_fr') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/deliveroo_fr/1639165126235/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deliveroo_fr
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Deliveroo France @deliveroo\_fr I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Deliveroo France. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deliveroo\_fr's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1335071835403284481/HWmtRssm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dace🐛🏠 🤖 AI Bot </div> <div style="font-size: 15px">@deliverydace bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deliverydace's tweets](https://twitter.com/deliverydace). | Data | Quantity | | --- | --- | | Tweets downloaded | 2003 | | Retweets | 169 | | Short tweets | 329 | | Tweets kept | 1505 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1826o3k3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deliverydace's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/24r42zx0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/24r42zx0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deliverydace') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deliverydace/1613630846956/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deliverydace
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Dace AI Bot @deliverydace bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deliverydace's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deliverydace's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1423104984430911493/79-4rY0R_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">OptionsWolf</div> <div style="text-align: center; font-size: 14px;">@deltagammaqueen</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from OptionsWolf. | Data | OptionsWolf | | --- | --- | | Tweets downloaded | 3245 | | Retweets | 264 | | Short tweets | 1164 | | Tweets kept | 1817 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2tt0l3wo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deltagammaqueen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/unz6kk43) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/unz6kk43/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deltagammaqueen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deltagammaqueen/1628736507176/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deltagammaqueen
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT OptionsWolf @deltagammaqueen I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from OptionsWolf. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deltagammaqueen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
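The long float lists that close each record are the `embeddings` column from the schema (768 values per row; which encoder produced them is not stated in the dump). A minimal sketch of comparing two such vectors with cosine similarity, assuming they are loaded as plain Python lists of floats; the short example vectors below are placeholders, not real rows:

```python
import math

def cosine_similarity(a, b):
    # Plain-Python cosine similarity between two equal-length float vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Placeholder vectors standing in for the 768-dimensional lists stored in the
# `embeddings` column of two different records.
emb_a = [0.01, -0.02, 0.03]
emb_b = [0.02, -0.01, 0.04]
print(cosine_similarity(emb_a, emb_b))
```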
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354964611586547715/WIIHy349_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">rj bday (season) 🦜🍓💝 🤖 AI Bot </div> <div style="font-size: 15px">@demirenjun bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@demirenjun's tweets](https://twitter.com/demirenjun). | Data | Quantity | | --- | --- | | Tweets downloaded | 3199 | | Retweets | 800 | | Short tweets | 384 | | Tweets kept | 2015 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bdlmgyb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @demirenjun's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ck8cxvw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ck8cxvw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/demirenjun') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/demirenjun/1617917661023/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/demirenjun
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
rj bday (season) AI Bot @demirenjun bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @demirenjun's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @demirenjun's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
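Each record carries the same fields listed in the schema (`id`, `text`, `metadata`, `pipeline_tag`, `tags`, `tokens_length`, `embeddings`, and so on). A minimal sketch of loading rows from a local JSON-lines export of this dump and filtering them by tag; the file name is an assumption, not part of the dataset:

```python
import json

# Hypothetical local export of this dump, one JSON object per line, with the
# column names shown in the schema. Adjust the path to wherever the data lives.
path = "huggingtweets_cards.jsonl"

rows = []
with open(path, encoding="utf-8") as fh:
    for line in fh:
        rows.append(json.loads(line))

# Keep only text-generation cards that carry the huggingtweets tag.
kept = [
    r for r in rows
    if r.get("pipeline_tag") == "text-generation"
    and "huggingtweets" in (r.get("tags") or [])
]
print(len(kept), "matching model cards")
```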
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378865749582872580/oTZARemq_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dení has returned. 🤖 AI Bot </div> <div style="font-size: 15px">@deni_is_aflor bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deni_is_aflor's tweets](https://twitter.com/deni_is_aflor). | Data | Quantity | | --- | --- | | Tweets downloaded | 3196 | | Retweets | 1101 | | Short tweets | 195 | | Tweets kept | 1900 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/22jo6jl8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deni_is_aflor's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/l4we4gl2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/l4we4gl2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deni_is_aflor') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deni_is_aflor/1617777629095/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deni_is_aflor
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Dení has returned. AI Bot @deni\_is\_aflor bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deni\_is\_aflor's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deni\_is\_aflor's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
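Each card's "How to use" section shows the minimal `pipeline` call. A slightly fuller sketch with explicit sampling settings, using the model id from the card above; the extra keyword arguments are standard `generate` options chosen for illustration, not values the card itself recommends:

```python
from transformers import pipeline

# Model id taken from the preceding card; any of the huggingtweets checkpoints
# is used the same way.
generator = pipeline("text-generation", model="huggingtweets/deni_is_aflor")

# Illustrative sampling settings (not specified by the card).
outputs = generator(
    "My dream is",
    max_length=50,
    do_sample=True,
    top_p=0.95,
    num_return_sequences=5,
)
for out in outputs:
    print(out["generated_text"])
```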
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1484264819049959425/siOsFP3t_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Den</div> <div style="text-align: center; font-size: 14px;">@denyah_</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Den. | Data | Den | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 464 | | Short tweets | 795 | | Tweets kept | 1985 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3e5c08gr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @denyah_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1438ocp8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1438ocp8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/denyah_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/denyah_/1643852632266/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/denyah_
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Den @denyah\_ I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Den. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @denyah\_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357656503566622720/PGCAnBgE_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">pete wolfendale 🤖 AI Bot </div> <div style="font-size: 15px">@deontologistics bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deontologistics's tweets](https://twitter.com/deontologistics). | Data | Quantity | | --- | --- | | Tweets downloaded | 3230 | | Retweets | 590 | | Short tweets | 187 | | Tweets kept | 2453 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ahwv4uv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deontologistics's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2dpgq6x6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2dpgq6x6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deontologistics') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deontologistics/1616689045190/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deontologistics
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
pete wolfendale AI Bot @deontologistics bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deontologistics's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deontologistics's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1392219624956219402/HuLQmDB6_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">yo sophist</div> <div style="text-align: center; font-size: 14px;">@deptofsophistry</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from yo sophist. | Data | yo sophist | | --- | --- | | Tweets downloaded | 3215 | | Retweets | 327 | | Short tweets | 762 | | Tweets kept | 2126 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3p698zbi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deptofsophistry's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3nt0sevr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3nt0sevr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deptofsophistry') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deptofsophistry/1621365721868/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deptofsophistry
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT yo sophist @deptofsophistry I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from yo sophist. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deptofsophistry's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1214723509521387520/7UENeEVp_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">DER SPIEGEL</div> <div style="text-align: center; font-size: 14px;">@derspiegel</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from DER SPIEGEL. | Data | DER SPIEGEL | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 478 | | Short tweets | 6 | | Tweets kept | 2766 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2uv8zr0k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @derspiegel's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/i3q4xu9o) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/i3q4xu9o/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/derspiegel') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/derspiegel/1638461583796/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/derspiegel
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT DER SPIEGEL @derspiegel I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from DER SPIEGEL. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @derspiegel's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1374540783202734082/5l7zt3RK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Dev, Bride of Kripkenstein</div> <div style="text-align: center; font-size: 14px;">@dervine7</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Dev, Bride of Kripkenstein. | Data | Dev, Bride of Kripkenstein | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 177 | | Short tweets | 272 | | Tweets kept | 2788 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2j2ia8ja/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dervine7's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/287itbe2) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/287itbe2/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dervine7') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dervine7/1633413178103/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dervine7
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Dev, Bride of Kripkenstein @dervine7 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Dev, Bride of Kripkenstein. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dervine7's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1156676050576801792/_i8SOLw3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nathan, stuck on magic mountain 🤖 AI Bot </div> <div style="font-size: 15px">@derweise91 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@derweise91's tweets](https://twitter.com/derweise91). | Data | Quantity | | --- | --- | | Tweets downloaded | 3233 | | Retweets | 468 | | Short tweets | 408 | | Tweets kept | 2357 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2cgk3c79/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @derweise91's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/yxo7yhz5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/yxo7yhz5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/derweise91') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
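Besides the `pipeline` call shown in the card, the same checkpoint can be driven through the lower-level tokenizer and model classes. The sketch below is a minimal equivalent, assuming the Hub id from the card; the generation settings are illustrative and not prescribed by the project.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and weights for the checkpoint named in the card.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/derweise91")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/derweise91")

inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=50,                        # illustrative length cap
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,  # avoid the missing-pad-token warning
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```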
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/derweise91/1616691639404/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/derweise91
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
nathan, stuck on magic mountain AI Bot @derweise91 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @derweise91's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @derweise91's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1372396296699404291/SySu1wAp_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">♦️Moira Perfected♦️ 🤖 AI Bot </div> <div style="font-size: 15px">@destiny_thememe bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@destiny_thememe's tweets](https://twitter.com/destiny_thememe). | Data | Quantity | | --- | --- | | Tweets downloaded | 3242 | | Retweets | 186 | | Short tweets | 772 | | Tweets kept | 2284 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bbkix40/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @destiny_thememe's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/20xpitr1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/20xpitr1/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/destiny_thememe') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/destiny_thememe/1616803427645/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/destiny_thememe
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Moira Perfected AI Bot @destiny\_thememe bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @destiny\_thememe's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @destiny\_thememe's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/2096229264/tdn_stacked_onblack_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1554982611/Nolan_Finley1_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/667024309995626496/OmzBnHNF_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Detroit News Opinion & Nolan Finley & Ingrid Jacques</div> <div style="text-align: center; font-size: 14px;">@detnewsopinion-ingrid_jacques-nolanfinleydn</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Detroit News Opinion & Nolan Finley & Ingrid Jacques. | Data | Detroit News Opinion | Nolan Finley | Ingrid Jacques | | --- | --- | --- | --- | | Tweets downloaded | 3250 | 3249 | 3248 | | Retweets | 530 | 1833 | 1324 | | Short tweets | 0 | 49 | 45 | | Tweets kept | 2720 | 1367 | 1879 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1ktqwqx5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @detnewsopinion-ingrid_jacques-nolanfinleydn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/vu0trurc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/vu0trurc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/detnewsopinion-ingrid_jacques-nolanfinleydn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/detnewsopinion-ingrid_jacques-nolanfinleydn/1639365489716/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/detnewsopinion-ingrid_jacques-nolanfinleydn
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Detroit News Opinion & Nolan Finley & Ingrid Jacques @detnewsopinion-ingrid\_jacques-nolanfinleydn I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Detroit News Opinion & Nolan Finley & Ingrid Jacques. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @detnewsopinion-ingrid\_jacques-nolanfinleydn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/2096229264/tdn_stacked_onblack_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Detroit News Opinion</div> <div style="text-align: center; font-size: 14px;">@detnewsopinion</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Detroit News Opinion. | Data | Detroit News Opinion | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 527 | | Short tweets | 0 | | Tweets kept | 2723 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gpe3yyem/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @detnewsopinion's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/zezrwsaf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/zezrwsaf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/detnewsopinion') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/detnewsopinion/1639432824211/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/detnewsopinion
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Detroit News Opinion @detnewsopinion I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Detroit News Opinion. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @detnewsopinion's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1412373998936027142/k2nY1nVc_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1426046688263692288/RzlZFjIP_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1312018147822759937/Z7XnZkhn_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">sad rico & follow me only if you're sad & ...</div> <div style="text-align: center; font-size: 14px;">@detseretninu-dumbricardo-illuminusnumb</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from sad rico & follow me only if you're sad & .... | Data | sad rico | follow me only if you're sad | ... | | --- | --- | --- | --- | | Tweets downloaded | 768 | 3233 | 677 | | Retweets | 0 | 167 | 1 | | Short tweets | 102 | 755 | 285 | | Tweets kept | 666 | 2311 | 391 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/l42hthlz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @detseretninu-dumbricardo-illuminusnumb's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/c1hyp8lf) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/c1hyp8lf/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/detseretninu-dumbricardo-illuminusnumb') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/detseretninu-dumbricardo-illuminusnumb/1629841756956/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/detseretninu-dumbricardo-illuminusnumb
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG sad rico & follow me only if you're sad & ... @detseretninu-dumbricardo-illuminusnumb I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from sad rico & follow me only if you're sad & .... Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @detseretninu-dumbricardo-illuminusnumb's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1316517962736701441/OM7fxPiG_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Riley 🇺🇸 🤖 AI Bot </div> <div style="font-size: 15px">@deusdairyland bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@deusdairyland's tweets](https://twitter.com/deusdairyland). | Data | Quantity | | --- | --- | | Tweets downloaded | 908 | | Retweets | 136 | | Short tweets | 219 | | Tweets kept | 553 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/e8tma1u2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @deusdairyland's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/146925y8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/146925y8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/deusdairyland') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/deusdairyland/1616653373811/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/deusdairyland
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Riley 🇺🇸 AI Bot @deusdairyland bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @deusdairyland's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @deusdairyland's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1288646558364372994/jgsTkFCl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">koob85 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@devkoob bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@devkoob's tweets](https://twitter.com/devkoob). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>712</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>27</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>191</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>494</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qdwtu190/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devkoob's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/nweh9viw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/nweh9viw/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/devkoob'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
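The pipeline snippet above is the simplest way to sample from this checkpoint. As a minimal sketch, assuming the `huggingtweets/devkoob` checkpoint loads like any other GPT-2 model (the sampling values below are illustrative, not taken from this card), the same generation can be done with an explicit tokenizer and model, which gives finer control over decoding:

```python
# Sketch only: direct loading instead of the high-level pipeline helper.
# Sampling parameters are illustrative assumptions, not project defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/devkoob")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/devkoob")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=60,                         # prompt + continuation, in tokens
    do_sample=True,                        # sampling allows several distinct sequences
    top_p=0.95,
    temperature=0.8,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 defines no pad token
)
for sequence in outputs:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```

Sampling (`do_sample=True`) is what allows `num_return_sequences` to produce several distinct continuations of one prompt.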
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/devkoob/1609552229453/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/devkoob
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">koob85 AI Bot </div> <div style="font-size: 15px; color: #657786">@devkoob bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @devkoob's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>712</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>27</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>191</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>494</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @devkoob's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/devkoob'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @devkoob's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>712</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>27</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>191</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>494</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @devkoob's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/devkoob'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @devkoob's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>712</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>27</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>191</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>494</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @devkoob's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/devkoob'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 429, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346152108836458496/SNQF5qH9_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">𝐃𝐄𝐕𝐎𝐍 🤖 AI Bot </div> <div style="font-size: 15px">@devon_onearth bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@devon_onearth's tweets](https://twitter.com/devon_onearth). | Data | Quantity | | --- | --- | | Tweets downloaded | 3227 | | Retweets | 449 | | Short tweets | 358 | | Tweets kept | 2420 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ilmmvbmb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devon_onearth's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ryyr6zq5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ryyr6zq5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/devon_onearth') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
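The training-data table above reports how many retweets and short tweets were dropped before the remaining tweets were used for fine-tuning. The exact filtering rules live in the huggingtweets pipeline; the sketch below is only an assumed approximation (the `RT @` prefix check and the character threshold are hypothetical, not the project's actual heuristics) of how such a split could be computed:

```python
# Assumed approximation of the tweet-filtering step; the threshold is hypothetical.
def split_tweets(tweets, min_chars=20):
    """Partition raw tweets into retweets, short tweets, and tweets kept."""
    retweets, short, kept = [], [], []
    for text in tweets:
        if text.startswith("RT @"):      # conventional retweet prefix
            retweets.append(text)
        elif len(text) < min_chars:      # assumed "short tweet" cutoff
            short.append(text)
        else:
            kept.append(text)
    return retweets, short, kept

sample = ["RT @someone: hello", "ok", "A longer tweet that would be kept for fine-tuning."]
retweets, short, kept = split_tweets(sample)
print(len(retweets), len(short), len(kept))  # 1 1 1
```

Only the kept tweets correspond to the "Tweets kept" row that feeds the fine-tuning run recorded in the W&B links above.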
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/devon_onearth/1614135166237/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/devon_onearth
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
𝐃𝐄𝐕𝐎𝐍 AI Bot @devon\_onearth bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @devon\_onearth's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @devon\_onearth's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1163117736140124160/u23u5DU4_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/748969887146471424/4BmVTQAv_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/74188698/NeilTysonOriginsA-Crop_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Nigel Thurlow & Ernest Wright, Ph. D. ABD & Neil deGrasse Tyson</div> <div style="text-align: center; font-size: 14px;">@devops_guru-neiltyson-nigelthurlow</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Nigel Thurlow & Ernest Wright, Ph. D. ABD & Neil deGrasse Tyson. | Data | Nigel Thurlow | Ernest Wright, Ph. D. ABD | Neil deGrasse Tyson | | --- | --- | --- | --- | | Tweets downloaded | 1264 | 1933 | 3250 | | Retweets | 648 | 20 | 10 | | Short tweets | 27 | 105 | 79 | | Tweets kept | 589 | 1808 | 3161 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/jc9vah1k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devops_guru-neiltyson-nigelthurlow's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2myicem9) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2myicem9/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/devops_guru-neiltyson-nigelthurlow') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
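This card covers a checkpoint fine-tuned on tweets from three accounts at once, but generation works exactly as in the snippet above. One practical detail, sketched below under the assumption that the standard `transformers` utilities apply to this checkpoint (the seed and generation settings are illustrative): seeding the sampler makes runs reproducible, and the pipeline accepts a list of prompts.

```python
# Sketch only: reproducible sampling and batched prompts with the pipeline helper.
from transformers import pipeline, set_seed

set_seed(42)  # fix the RNG so repeated runs sample the same continuations
generator = pipeline(
    "text-generation",
    model="huggingtweets/devops_guru-neiltyson-nigelthurlow",
)

prompts = ["My dream is", "The universe is"]
for result in generator(prompts, max_length=40, do_sample=True, num_return_sequences=2):
    for candidate in result:          # one inner list of candidates per prompt
        print(candidate["generated_text"])
```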
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/devops_guru-neiltyson-nigelthurlow/1626908139492/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/devops_guru-neiltyson-nigelthurlow
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Nigel Thurlow & Ernest Wright, Ph. D. ABD & Neil deGrasse Tyson @devops\_guru-neiltyson-nigelthurlow I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Nigel Thurlow & Ernest Wright, Ph. D. ABD & Neil deGrasse Tyson. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @devops\_guru-neiltyson-nigelthurlow's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1059117502054195202/0NYJNcaD_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">an actual dog 🤖 AI Bot </div> <div style="font-size: 15px">@devtesla bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@devtesla's tweets](https://twitter.com/devtesla). | Data | Quantity | | --- | --- | | Tweets downloaded | 3164 | | Retweets | 1246 | | Short tweets | 222 | | Tweets kept | 1696 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3mgmikdu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devtesla's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/8vrkz503) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/8vrkz503/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/devtesla') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
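The pipeline call in the card above samples freely, so its outputs vary between runs. A minimal sketch of making that sampling repeatable (assuming `transformers.set_seed`; the seed value itself is arbitrary and not from the card):

```python
from transformers import pipeline, set_seed

set_seed(42)  # arbitrary seed, only so that repeated runs produce comparable output
generator = pipeline('text-generation', model='huggingtweets/devtesla')
generator("My dream is", num_return_sequences=5)
```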
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/devtesla/1614137580281/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/devtesla
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
an actual dog AI Bot @devtesla bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @devtesla's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @devtesla's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343584411145666560/uF3JWccD_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Devtrospective 🤖 AI Bot </div> <div style="font-size: 15px">@devtrospective bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@devtrospective's tweets](https://twitter.com/devtrospective). | Data | Quantity | | --- | --- | | Tweets downloaded | 3239 | | Retweets | 562 | | Short tweets | 414 | | Tweets kept | 2263 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3fwfr76h/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @devtrospective's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3moy4evm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3moy4evm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/devtrospective') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
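The card's example relies on default generation settings. As a minimal sketch (the specific `top_p` and `max_new_tokens` values are illustrative assumptions, not taken from the card), generation keyword arguments can be passed straight through the pipeline call, which forwards them to `model.generate()`:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/devtrospective')
# Generation kwargs are forwarded to model.generate(); the values here are illustrative.
generator(
    "My dream is",
    do_sample=True,
    top_p=0.9,           # nucleus-sampling cutoff
    max_new_tokens=30,   # cap on newly generated tokens, excluding the prompt
    num_return_sequences=3,
)
```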
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/devtrospective/1617905426485/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/devtrospective
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Devtrospective AI Bot @devtrospective bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @devtrospective's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @devtrospective's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1379969696028581890/nIzf87ii_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">dGc 🤖 AI Bot </div> <div style="font-size: 15px">@dgcyt_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dgcyt_'s tweets](https://twitter.com/dgcyt_). | Data | Quantity | | --- | --- | | Tweets downloaded | 907 | | Retweets | 31 | | Short tweets | 353 | | Tweets kept | 523 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2172uj60/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dgcyt_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3goormmx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3goormmx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dgcyt_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dgcyt_/1619447035696/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dgcyt_
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
dGc AI Bot @dgcyt\_ bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dgcyt\_'s tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dgcyt\_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/870004265170948097/5tyWgIkd_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Damien Henry 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@dh7net bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dh7net's tweets](https://twitter.com/dh7net). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2463</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>857</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>227</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1379</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/26i29me7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dh7net's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/6w8xyhch) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/6w8xyhch/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dh7net'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dh7net/1602195754110/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dh7net
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Damien Henry AI Bot </div> <div style="font-size: 15px; color: #657786">@dh7net bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @dh7net's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2463</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>857</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>227</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1379</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @dh7net's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dh7net'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dh7net's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2463</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>857</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>227</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1379</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dh7net's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dh7net'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dh7net's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2463</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>857</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>227</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1379</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dh7net's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dh7net'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 431, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/911053058104221696/ERPL-sS4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dharmesh Kakadia 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@dharmeshkakadia bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dharmeshkakadia's tweets](https://twitter.com/dharmeshkakadia). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3231</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1284</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>505</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1442</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/igebzms3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dharmeshkakadia's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2rjrmg20) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2rjrmg20/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dharmeshkakadia'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dharmeshkakadia/1602267558589/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dharmeshkakadia
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dharmesh Kakadia AI Bot </div> <div style="font-size: 15px; color: #657786">@dharmeshkakadia bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @dharmeshkakadia's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3231</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1284</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>505</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1442</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @dharmeshkakadia's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dharmeshkakadia'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dharmeshkakadia's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3231</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1284</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>505</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1442</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dharmeshkakadia's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dharmeshkakadia'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dharmeshkakadia's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3231</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1284</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>505</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1442</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dharmeshkakadia's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dharmeshkakadia'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 433, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1245735185787822080/riKefvZr_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@diaz_de_leon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@diaz_de_leon's tweets](https://twitter.com/diaz_de_leon). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>718</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>167</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>66</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>485</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/w8v47wri/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @diaz_de_leon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/17bl278f) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/17bl278f/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/diaz_de_leon'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
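The "Training procedure" section above says this checkpoint is a pre-trained GPT-2 fine-tuned on the account's kept tweets (485 in this case), with hyperparameters and metrics logged to W&B. The actual huggingtweets training script is not included in the card, so the following is only a minimal sketch of that kind of run using the `transformers` Trainer: the placeholder tweet list, the tokenization settings, and every value in `TrainingArguments` (epochs, batch size, output directory) are assumptions, and `report_to="wandb"` merely turns on the W&B logging the card describes (it requires the `wandb` package and a logged-in account).

```python
from datasets import Dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Placeholder data: the real run would use the account's kept tweets.
tweets = ["example tweet one", "example tweet two", "example tweet three"]
dataset = Dataset.from_dict({"text": tweets}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
)

args = TrainingArguments(
    output_dir="huggingtweets-finetune",  # assumed
    num_train_epochs=4,                   # assumed
    per_device_train_batch_size=1,        # assumed
    report_to="wandb",                    # log metrics to Weights & Biases
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()  # the resulting checkpoint is what a card like this one publishes
```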
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/diaz_de_leon/1603509315873/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/diaz_de_leon
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos AI Bot </div> <div style="font-size: 15px; color: #657786">@diaz_de_leon bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @diaz_de_leon's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>718</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>167</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>66</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>485</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @diaz_de_leon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/diaz_de_leon'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @diaz_de_leon's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>718</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>167</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>66</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>485</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @diaz_de_leon's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/diaz_de_leon'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @diaz_de_leon's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>718</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>167</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>66</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>485</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @diaz_de_leon's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/diaz_de_leon'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 79, 9, 171, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1364103848957120513/Ww-W98d1_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">💾 🤖 AI Bot </div> <div style="font-size: 15px">@digital_languor bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@digital_languor's tweets](https://twitter.com/digital_languor). | Data | Quantity | | --- | --- | | Tweets downloaded | 3200 | | Retweets | 1037 | | Short tweets | 589 | | Tweets kept | 1574 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1z1me0hi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digital_languor's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/uffw47ml) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/uffw47ml/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digital_languor') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/digital_languor
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI Bot @digital\_languor bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @digital\_languor's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @digital\_languor's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1395222151477743621/g8GO73EW_400x400.png&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">artchick.eth 🔥</div> <div style="text-align: center; font-size: 14px;">@digitalartchick</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from artchick.eth 🔥. | Data | artchick.eth 🔥 | | --- | --- | | Tweets downloaded | 3248 | | Retweets | 173 | | Short tweets | 580 | | Tweets kept | 2495 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3m0unu0z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digitalartchick's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3q41dpi6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3q41dpi6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digitalartchick') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/digitalartchick/1622110282921/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/digitalartchick
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT URL @digitalartchick I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from URL . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @digitalartchick's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1233161774444113920/7La7pvBs_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Digitalsolver 🤖 AI Bot </div> <div style="font-size: 15px">@digitalsolver1 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@digitalsolver1's tweets](https://twitter.com/digitalsolver1). | Data | Quantity | | --- | --- | | Tweets downloaded | 291 | | Retweets | 112 | | Short tweets | 70 | | Tweets kept | 109 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23z4oayh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digitalsolver1's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/237vwzkl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/237vwzkl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digitalsolver1') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
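The pipeline call in the card relies on default generation settings; common knobs such as `max_length`, `temperature`, and `top_p` can be passed straight through the same call. The values below are illustrative assumptions only and are not taken from the card or the training run.

```python
from transformers import pipeline

generator = pipeline("text-generation", model="huggingtweets/digitalsolver1")

# Lower temperature plus nucleus sampling tends to give tamer, more on-topic
# continuations; raising temperature gives wilder output. Example values only.
results = generator(
    "My dream is",
    max_length=40,
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
    num_return_sequences=3,
)
for r in results:
    print(r["generated_text"])
```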
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/digitalsolver1/1616653735166/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/digitalsolver1
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Digitalsolver AI Bot @digitalsolver1 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @digitalsolver1's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @digitalsolver1's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1305154265267212292/BSD6EVuq_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">george w kush 🤖 AI Bot </div> <div style="font-size: 15px">@digitalsoyboy bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@digitalsoyboy's tweets](https://twitter.com/digitalsoyboy). | Data | Quantity | | --- | --- | | Tweets downloaded | 3170 | | Retweets | 462 | | Short tweets | 369 | | Tweets kept | 2339 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/26qiav6i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @digitalsoyboy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3b4m8rf4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3b4m8rf4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/digitalsoyboy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/digitalsoyboy/1617805776990/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/digitalsoyboy
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
george w kush AI Bot @digitalsoyboy bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @digitalsoyboy's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @digitalsoyboy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
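The plain-text rendering above drops the fenced code block from the card's "How to use" section. A minimal sketch of that usage, reconstructed from this record's full card (the model id `huggingtweets/digitalsoyboy` and the `num_return_sequences=5` argument come from the card itself, not from any outside source):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/digitalsoyboy')

# Generate five continuations of the prompt, as in the card's example
generator("My dream is", num_return_sequences=5)
```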
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1336779061025267715/zRfiUbb7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jess O'Brien 🤖 AI Bot </div> <div style="font-size: 15px">@disabledjess bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@disabledjess's tweets](https://twitter.com/disabledjess). | Data | Quantity | | --- | --- | | Tweets downloaded | 713 | | Retweets | 324 | | Short tweets | 34 | | Tweets kept | 355 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/dt08vg5c/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @disabledjess's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/zxrg63ip) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/zxrg63ip/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/disabledjess') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/disabledjess/1616670355194/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/disabledjess
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jess O'Brien AI Bot @disabledjess bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @disabledjess's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @disabledjess's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
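As above, the fenced usage block was stripped from this plain-text rendering; per this record's full card, the sketch differs only in the model id (`huggingtweets/disabledjess`):

```python
from transformers import pipeline

# Pipeline call taken from the card's "How to use" section
generator = pipeline('text-generation', model='huggingtweets/disabledjess')
generator("My dream is", num_return_sequences=5)
```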
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1029964613029437440/3_fRmZuH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">luna 🤖 AI Bot </div> <div style="font-size: 15px">@discarddiscord bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@discarddiscord's tweets](https://twitter.com/discarddiscord). | Data | Quantity | | --- | --- | | Tweets downloaded | 1495 | | Retweets | 289 | | Short tweets | 213 | | Tweets kept | 993 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1tvxkurq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @discarddiscord's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2g2xt22m) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2g2xt22m/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/discarddiscord') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/discarddiscord/1614246710317/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/discarddiscord
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
luna AI Bot @discarddiscord bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @discarddiscord's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @discarddiscord's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
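Same stripped code block here; this record's full card gives the usage with `huggingtweets/discarddiscord` as the model id:

```python
from transformers import pipeline

# Pipeline call taken from the card's "How to use" section
generator = pipeline('text-generation', model='huggingtweets/discarddiscord')
generator("My dream is", num_return_sequences=5)
```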
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/983773516880232448/XsKqt1c8_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">andrew blinn 🤖 AI Bot </div> <div style="font-size: 15px">@disconcision bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@disconcision's tweets](https://twitter.com/disconcision). | Data | Quantity | | --- | --- | | Tweets downloaded | 2262 | | Retweets | 453 | | Short tweets | 159 | | Tweets kept | 1650 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/n4jrdsqh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @disconcision's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2f36wyoh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2f36wyoh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/disconcision') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
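The training-procedure paragraph in the card above describes fine-tuning a pre-trained GPT-2 on the account's tweets, but the card only ships the inference snippet. A minimal fine-tuning sketch, assuming a plain-text file `tweets.txt` with one kept tweet per line and illustrative hyperparameters (not the values recorded in the linked W&B run), could look like:

```python
from transformers import (
    DataCollatorForLanguageModeling,
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    TextDataset,
    Trainer,
    TrainingArguments,
)

# Start from the same pre-trained checkpoint the card names.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# One kept tweet per line, packed into fixed-length blocks for causal LM training.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="gpt2-tweets",
    num_train_epochs=4,            # illustrative, not the tuned value
    per_device_train_batch_size=8,  # illustrative, not the tuned value
    report_to="wandb",              # mirrors the W&B tracking the card mentions
)

Trainer(
    model=model,
    args=training_args,
    data_collator=collator,
    train_dataset=train_dataset,
).train()
```

The real hyperparameters live in the W&B run linked from the card; this block only shows the shape of the step.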
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/disconcision/1616643733458/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/disconcision
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
andrew blinn AI Bot @disconcision bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @disconcision's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @disconcision's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1426930394297819137/-zzMnfJo_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/980964012170121217/U6FjPH4H_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">LIAM & wint & Picasso</div> <div style="text-align: center; font-size: 14px;">@discountpicasso-dril-liam_100000</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from LIAM & wint & Picasso. | Data | LIAM | wint | Picasso | | --- | --- | --- | --- | | Tweets downloaded | 1962 | 3226 | 3216 | | Retweets | 135 | 472 | 427 | | Short tweets | 435 | 313 | 421 | | Tweets kept | 1392 | 2441 | 2368 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1w4ekve8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @discountpicasso-dril-liam_100000's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2s4a755y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2s4a755y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/discountpicasso-dril-liam_100000') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/discountpicasso-dril-liam_100000/1630973640579/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/discountpicasso-dril-liam_100000
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG LIAM & wint & Picasso @discountpicasso-dril-liam\_100000 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from LIAM & wint & Picasso. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @discountpicasso-dril-liam\_100000's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1362528219908476928/DGdEDaOH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">malignant tzara 🤖 AI Bot </div> <div style="font-size: 15px">@divorceenforcer bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@divorceenforcer's tweets](https://twitter.com/divorceenforcer). | Data | Quantity | | --- | --- | | Tweets downloaded | 3148 | | Retweets | 1127 | | Short tweets | 574 | | Tweets kept | 1447 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2b3i6627/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @divorceenforcer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/13x1aewb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/13x1aewb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/divorceenforcer') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
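The procedure section above notes that the final model "is logged and versioned" with W&B artifacts, but the logging code itself is not part of the card. A sketch of that step using the public wandb API, assuming the fine-tuned checkpoint sits in a local directory (the project and artifact names are illustrative), could be:

```python
import wandb
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

# The checkpoint path is an assumption for illustration.
model = GPT2LMHeadModel.from_pretrained("gpt2-tweets")
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model.save_pretrained("gpt2-tweets/final")
tokenizer.save_pretrained("gpt2-tweets/final")

run = wandb.init(project="huggingtweets", job_type="train")
artifact = wandb.Artifact("gpt2-tweets", type="model")
artifact.add_dir("gpt2-tweets/final")
run.log_artifact(artifact)  # each upload becomes a new, versioned artifact entry
run.finish()
```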
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/divorceenforcer/1614097005501/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/divorceenforcer
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
malignant tzara AI Bot @divorceenforcer bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @divorceenforcer's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @divorceenforcer's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1373813806179225602/dfnXLAJp_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">memphis milano enthusiast 🤖 AI Bot </div> <div style="font-size: 15px">@dkulchar bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dkulchar's tweets](https://twitter.com/dkulchar). | Data | Quantity | | --- | --- | | Tweets downloaded | 3236 | | Retweets | 551 | | Short tweets | 569 | | Tweets kept | 2116 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ar0h0xc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dkulchar's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/v3hyz25i) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/v3hyz25i/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dkulchar') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dkulchar
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
memphis milano enthusiast AI Bot @dkulchar bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dkulchar's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dkulchar's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
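As in the previous record, the processed text drops the code block; a minimal sketch of the same pipeline usage for this checkpoint, extended to print each returned sequence (the `generated_text` field is the standard key returned by the text-generation pipeline; treat the exact output shape as an assumption if your `transformers` version differs):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/dkulchar')

# Each item in the returned list is a dict with a 'generated_text' field
for sample in generator("My dream is", num_return_sequences=5):
    print(sample['generated_text'])
```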
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1428106877926260736/xiq2bdMI_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Pirate Queen Grey</div> <div style="text-align: center; font-size: 14px;">@dndomme</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Pirate Queen Grey. | Data | Pirate Queen Grey | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 1329 | | Short tweets | 288 | | Tweets kept | 1601 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3ucgtv6r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dndomme's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1sej7nbm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1sej7nbm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dndomme') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dndomme/1632870893354/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dndomme
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Pirate Queen Grey @dndomme I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Pirate Queen Grey. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dndomme's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
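The pipeline example omitted here matches the earlier cards; as an alternative, a hedged sketch of loading the checkpoint with the lower-level `AutoTokenizer`/`AutoModelForCausalLM` classes. This assumes the checkpoint loads with the standard GPT-2 causal-LM architecture, which the record's tags indicate but which is not shown explicitly in the card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/dndomme")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/dndomme")

inputs = tokenizer("My dream is", return_tensors="pt")
# Sample a few short continuations; pad_token_id is set because GPT-2 has no pad token
outputs = model.generate(
    **inputs,
    do_sample=True,
    max_length=40,
    num_return_sequences=3,
    pad_token_id=tokenizer.eos_token_id,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```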
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1193922190955167744/kFfjfcL4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Matthias Dobbelaere-Welvaert 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@dobbelaerew bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dobbelaerew's tweets](https://twitter.com/dobbelaerew). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3208</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>344</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>350</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2514</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/30tqyeyq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dobbelaerew's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2rthb7d0) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2rthb7d0/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dobbelaerew'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
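The card above shows the minimal `pipeline` call. For completeness, a slightly fuller, hypothetical variant with explicit sampling settings is sketched below; the specific values (`max_new_tokens`, `top_p`, `temperature`) are illustrative assumptions and are not part of the original card.

```python
# Sketch only: same pipeline as in the card, with explicit sampling settings.
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/dobbelaerew')

# do_sample/top_p/temperature are illustrative choices, not values from the card.
generator("My dream is",
          max_new_tokens=40,
          do_sample=True,
          top_p=0.95,
          temperature=0.9,
          num_return_sequences=5)
```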
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dobbelaerew
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Matthias Dobbelaere-Welvaert AI Bot </div> <div style="font-size: 15px; color: #657786">@dobbelaerew bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @dobbelaerew's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3208</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>344</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>350</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2514</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @dobbelaerew's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/dobbelaerew'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dobbelaerew's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3208</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>344</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>350</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2514</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dobbelaerew's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dobbelaerew'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @dobbelaerew's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3208</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>344</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>350</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2514</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @dobbelaerew's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/dobbelaerew'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 76, 9, 168, 48, 58 ]
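The short list of integers above holds one value per chunk of the processed card text listed just above it. A hypothetical reconstruction of how such per-chunk token counts could be computed is sketched below; the choice of the GPT-2 tokenizer is an assumption, and the chunk strings are placeholders rather than the full chunks from this record.

```python
# Hypothetical sketch: one token count per text chunk.
# Assumption: counts come from the GPT-2 byte-level BPE tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")

chunks = [
    "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation ...",  # placeholder chunk
    "## How does it work? ...",                                      # placeholder chunk
]
token_lengths = [len(tokenizer(chunk)["input_ids"]) for chunk in chunks]
print(token_lengths)  # one integer per chunk
```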
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
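The long float vectors above are fixed-size text embeddings of the "passage: …" input shown a few lines earlier. A hypothetical sketch of how vectors like these could be produced with a sentence encoder follows; the "passage:" prefix on the input suggests an E5-style model, but the specific checkpoint (`intfloat/e5-base-v2`) and its 768-dimensional output are assumptions, not something stated in this record.

```python
# Hypothetical sketch: encoding a "passage: ..." string into a dense vector.
# Assumption: an E5-style sentence encoder; the exact checkpoint is not stated here.
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("intfloat/e5-base-v2")
text = "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation ..."
vector = encoder.encode(text)
print(vector.shape)  # (768,) for this assumed checkpoint
```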
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1366152724069326848/1rkDhvsG_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Tim Houk 🤖 AI Bot </div> <div style="font-size: 15px">@dochouk bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dochouk's tweets](https://twitter.com/dochouk). | Data | Quantity | | --- | --- | | Tweets downloaded | 441 | | Retweets | 21 | | Short tweets | 11 | | Tweets kept | 409 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23t4thvg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dochouk's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/218a58qu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/218a58qu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dochouk') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dochouk/1616781012029/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dochouk
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Tim Houk AI Bot @dochouk bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dochouk's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dochouk's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
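In the plain-text rendering above, the fenced code block under "How to use" was dropped, so the sentence runs straight into "Limitations and bias". The snippet as it appears in the original card earlier in this record is:

```python
from transformers import pipeline

# Text-generation pipeline backed by the fine-tuned model from this record.
generator = pipeline('text-generation', model='huggingtweets/dochouk')
generator("My dream is", num_return_sequences=5)
```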
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1250027548785938432/KHyOaVQY_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Emmet Burke 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@doctor_emmet bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@doctor_emmet's tweets](https://twitter.com/doctor_emmet). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2496</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>204</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>176</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2116</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/duj1xqx6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @doctor_emmet's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/yzdnl9ld) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/yzdnl9ld/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/doctor_emmet'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
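The card above states that a pre-trained GPT-2 is fine-tuned on the account's tweets, with hyperparameters and metrics logged to W&B, but the training code itself is not reproduced. A minimal, hypothetical sketch of such a fine-tune with the `transformers` Trainer API follows; the file name, hyperparameters, and the use of `Trainer` are assumptions rather than the actual huggingtweets training script.

```python
# Hypothetical sketch: fine-tuning GPT-2 on a text file of preprocessed tweets.
# "tweets.txt", the hyperparameters, and the Trainer setup are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# One preprocessed tweet per line in a plain text file (placeholder path).
dataset = load_dataset("text", data_files={"train": "tweets.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True, remove_columns=["text"])

collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)
args = TrainingArguments(output_dir="gpt2-tweets", num_train_epochs=4,
                         per_device_train_batch_size=8, report_to="wandb")
Trainer(model=model, args=args, train_dataset=dataset, data_collator=collator).train()
```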
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/doctor_emmet/1603833315216/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/doctor_emmet
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Emmet Burke AI Bot </div> <div style="font-size: 15px; color: #657786">@doctor_emmet bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @doctor_emmet's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2496</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>204</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>176</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2116</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @doctor_emmet's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/doctor_emmet'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @doctor_emmet's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2496</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>204</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>176</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2116</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @doctor_emmet's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/doctor_emmet'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @doctor_emmet's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2496</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>204</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>176</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2116</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @doctor_emmet's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/doctor_emmet'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 431, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1383905819217911808/AIWNRt5y_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">dodo82.jp</div> <div style="text-align: center; font-size: 14px;">@dodo82j</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from dodo82.jp. | Data | dodo82.jp | | --- | --- | | Tweets downloaded | 217 | | Retweets | 31 | | Short tweets | 26 | | Tweets kept | 160 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2k4cbj1t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dodo82j's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qiazp47) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qiazp47/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dodo82j') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dodo82j/1628669484939/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dodo82j
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT URL @dodo82j I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from URL. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dodo82j's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1400698471385083904/sLTt0UmS_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1046968391389589507/_0r5bQLl_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Elon Musk & Thoughts of Dog®</div> <div style="text-align: center; font-size: 14px;">@dog_feelings-elonmusk</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Elon Musk & Thoughts of Dog®. | Data | Elon Musk | Thoughts of Dog® | | --- | --- | --- | | Tweets downloaded | 400 | 1148 | | Retweets | 32 | 14 | | Short tweets | 123 | 17 | | Tweets kept | 245 | 1117 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2vw0f8wk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dog_feelings-elonmusk's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2o3nweey) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2o3nweey/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dog_feelings-elonmusk') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dog_feelings-elonmusk
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Elon Musk & Thoughts of Dog® @dog\_feelings-elonmusk I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Elon Musk & Thoughts of Dog®. Data: Tweets downloaded, Elon Musk: 400, Thoughts of Dog®: 1148 Data: Retweets, Elon Musk: 32, Thoughts of Dog®: 14 Data: Short tweets, Elon Musk: 123, Thoughts of Dog®: 17 Data: Tweets kept, Elon Musk: 245, Thoughts of Dog®: 1117 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dog\_feelings-elonmusk's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1046968391389589507/_0r5bQLl_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Thoughts of Dog®</div> <div style="text-align: center; font-size: 14px;">@dog_feelings</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Thoughts of Dog®. | Data | Thoughts of Dog® | | --- | --- | | Tweets downloaded | 1213 | | Retweets | 23 | | Short tweets | 21 | | Tweets kept | 1169 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1u7m68sz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dog_feelings's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/lt738fgm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/lt738fgm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dog_feelings') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/dog_feelings/1668288769350/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dog_feelings
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Thoughts of Dog® @dog\_feelings I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Thoughts of Dog®. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dog\_feelings's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359563991467646982/9ZPrurHY_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🥞 🤖 AI Bot </div> <div style="font-size: 15px">@dogdick420cum bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dogdick420cum's tweets](https://twitter.com/dogdick420cum). | Data | Quantity | | --- | --- | | Tweets downloaded | 3242 | | Retweets | 148 | | Short tweets | 512 | | Tweets kept | 2582 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/nzltah4f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dogdick420cum's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29pe6wy0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29pe6wy0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dogdick420cum') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dogdick420cum/1615429013878/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dogdick420cum
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI Bot @dogdick420cum bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dogdick420cum's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dogdick420cum's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365105738163621895/vgJ99pHa_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">doge (likes democracy) 🌐 🤖 AI Bot </div> <div style="font-size: 15px">@dogepod_ bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dogepod_'s tweets](https://twitter.com/dogepod_). | Data | Quantity | | --- | --- | | Tweets downloaded | 3237 | | Retweets | 584 | | Short tweets | 525 | | Tweets kept | 2128 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/316ieof3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dogepod_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2fl8hjof) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2fl8hjof/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dogepod_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dogepod_/1617166176912/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dogepod_
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
doge (likes democracy) AI Bot @dogepod\_ bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dogepod\_'s tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dogepod\_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1392101530002657290/MFq0e-VM_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Doityboy</div> <div style="text-align: center; font-size: 14px;">@doityboy</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Doityboy. | Data | Doityboy | | --- | --- | | Tweets downloaded | 3180 | | Retweets | 551 | | Short tweets | 660 | | Tweets kept | 1969 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/17aeg3tr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @doityboy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3qumubtj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3qumubtj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/doityboy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/doityboy/1621603103969/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/doityboy
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Doityboy @doityboy I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Doityboy. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @doityboy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1487993727918374915/aN2YUrbc_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jean-Emmanuel De La Martinière</div> <div style="text-align: center; font-size: 14px;">@dojacat</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jean-Emmanuel De La Martinière. | Data | Jean-Emmanuel De La Martinière | | --- | --- | | Tweets downloaded | 1569 | | Retweets | 124 | | Short tweets | 322 | | Tweets kept | 1123 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3mc5ryte/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dojacat's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3urxj6el) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3urxj6el/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dojacat') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/dojacat/1644852645931/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dojacat
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jean-Emmanuel De La Martinière @dojacat I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jean-Emmanuel De La Martinière. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dojacat's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1375881363056947208/CpdPn02h_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dom and Cats 😼 🤖 AI Bot </div> <div style="font-size: 15px">@domandcats bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@domandcats's tweets](https://twitter.com/domandcats). | Data | Quantity | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 69 | | Short tweets | 452 | | Tweets kept | 2728 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/24l3uch3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @domandcats's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/nsc2js1f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/nsc2js1f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/domandcats') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/domandcats/1616883428985/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/domandcats
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Dom and Cats AI Bot @domandcats bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @domandcats's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @domandcats's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1146161910448054273/b1HpVczo_400x400.png&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Domonic</div> <div style="text-align: center; font-size: 14px;">@domonic_m</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Domonic. | Data | Domonic | | --- | --- | | Tweets downloaded | 502 | | Retweets | 70 | | Short tweets | 69 | | Tweets kept | 363 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1q7f1cu6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @domonic_m's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/no8iew6j) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/no8iew6j/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/domonic_m') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/domonic_m/1629517784951/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/domonic_m
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Domonic @domonic\_m I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Domonic. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @domonic\_m's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/937797480007241729/JyzkRlnB_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Donald Clark 🤖 AI Bot </div> <div style="font-size: 15px">@donaldclark bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@donaldclark's tweets](https://twitter.com/donaldclark). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 4 | | Short tweets | 195 | | Tweets kept | 3051 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2vaujq4r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @donaldclark's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2of8k8rc) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2of8k8rc/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/donaldclark') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/donaldclark/1617223633702/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/donaldclark
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Donald Clark AI Bot @donaldclark bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @donaldclark's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @donaldclark's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/610933861866835969/wRgRnVOt_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Donald Hoffman 🤖 AI Bot </div> <div style="font-size: 15px">@donalddhoffman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@donalddhoffman's tweets](https://twitter.com/donalddhoffman). | Data | Quantity | | --- | --- | | Tweets downloaded | 236 | | Retweets | 11 | | Short tweets | 45 | | Tweets kept | 180 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wzfrcs4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @donalddhoffman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2zo2lld7) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2zo2lld7/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/donalddhoffman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/donalddhoffman
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Donald Hoffman AI Bot @donalddhoffman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @donalddhoffman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @donalddhoffman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
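A minimal sketch of that text-generation pipeline, using the model id and example prompt given in this record (it mirrors the snippet in the full card text above; `num_return_sequences=5` follows the original card):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 model from the Hugging Face Hub by its id.
generator = pipeline('text-generation', model='huggingtweets/donalddhoffman')

# Prompt taken from the card's example widget; samples five continuations.
generator("My dream is", num_return_sequences=5)
```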
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1245523276128010240/kEFAcj1B_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Donkey Kong</div> <div style="text-align: center; font-size: 14px;">@donkeykongape</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Donkey Kong. | Data | Donkey Kong | | --- | --- | | Tweets downloaded | 3200 | | Retweets | 72 | | Short tweets | 1081 | | Tweets kept | 2047 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1pcwumgk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @donkeykongape's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/253exk8q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/253exk8q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/donkeykongape') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/donkeykongape/1625293730159/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/donkeykongape
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Donkey Kong @donkeykongape I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Donkey Kong. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @donkeykongape's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
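A minimal sketch of that text-generation pipeline for this record's model id and example prompt (matching the snippet in the full card text above):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 model from the Hugging Face Hub by its id.
generator = pipeline('text-generation', model='huggingtweets/donkeykongape')

# Prompt taken from the card's example widget; samples five continuations.
generator("My dream is", num_return_sequences=5)
```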
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1143276012601401345/VivOmTnV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Me on the left 🤖 AI Bot </div> <div style="font-size: 15px">@dontgender bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dontgender's tweets](https://twitter.com/dontgender). | Data | Quantity | | --- | --- | | Tweets downloaded | 2340 | | Retweets | 1023 | | Short tweets | 311 | | Tweets kept | 1006 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/34s4a2i7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dontgender's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/sl8zueoq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/sl8zueoq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dontgender') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dontgender/1614140992709/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dontgender
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Me on the left AI Bot @dontgender bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dontgender's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dontgender's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
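A minimal sketch of that text-generation pipeline for this record's model id and example prompt (matching the snippet in the full card text above):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 model from the Hugging Face Hub by its id.
generator = pipeline('text-generation', model='huggingtweets/dontgender')

# Prompt taken from the card's example widget; samples five continuations.
generator("My dream is", num_return_sequences=5)
```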
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1312200651238072321/54qAE_Rr_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Don Winslow 🤖 AI Bot </div> <div style="font-size: 15px">@donwinslow bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@donwinslow's tweets](https://twitter.com/donwinslow). | Data | Quantity | | --- | --- | | Tweets downloaded | 3219 | | Retweets | 1841 | | Short tweets | 169 | | Tweets kept | 1209 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2jonj6ue/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @donwinslow's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1jogue52) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1jogue52/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/donwinslow') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/donwinslow/1612878348095/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/donwinslow
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Don Winslow AI Bot @donwinslow bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @donwinslow's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @donwinslow's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
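As in the other rows, the usage code was stripped from this processed text. Besides the pipeline one-liner given in the full card above, a lower-level sketch with `AutoTokenizer`/`AutoModelForCausalLM` is shown below; the sampling settings (`max_length`, `temperature`, `top_p`) are illustrative assumptions, not values taken from the card or the W&B run.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and weights for the @donwinslow checkpoint.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/donwinslow")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/donwinslow")

# Encode a prompt and sample one continuation; the sampling parameters are assumed, not documented.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True, temperature=0.9, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```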
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1326642076298203136/_aPBjlCI_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Thistle Bnuuy 🤖 AI Bot </div> <div style="font-size: 15px">@dorkyfolf bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dorkyfolf's tweets](https://twitter.com/dorkyfolf). | Data | Quantity | | --- | --- | | Tweets downloaded | 2881 | | Retweets | 1665 | | Short tweets | 255 | | Tweets kept | 961 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2m0yq9vg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dorkyfolf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wv3osjp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wv3osjp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dorkyfolf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dorkyfolf/1617804114723/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dorkyfolf
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Thistle Bnuuy AI Bot @dorkyfolf bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dorkyfolf's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dorkyfolf's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
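Here too the processed text announces the text-generation pipeline but omits the code. The sketch below follows the card's pipeline example and adds generation keyword arguments (`max_length`, `do_sample`) that the pipeline forwards to `generate()`; those particular values are assumptions for illustration, while the model id and `num_return_sequences=5` come from the card.

```python
from transformers import pipeline

# Text-generation pipeline for the @dorkyfolf fine-tuned GPT-2, as in the original card.
generator = pipeline('text-generation', model='huggingtweets/dorkyfolf')

# num_return_sequences matches the card; max_length and do_sample are assumed extras.
samples = generator("My dream is", num_return_sequences=5, max_length=40, do_sample=True)
for sample in samples:
    print(sample["generated_text"])
```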
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356184155881672705/giFRkA6Z_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos Santana - DotCSV 🧠🤖 🤖 AI Bot </div> <div style="font-size: 15px">@dotcsv bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dotcsv's tweets](https://twitter.com/dotcsv). | Data | Quantity | | --- | --- | | Tweets downloaded | 3219 | | Retweets | 1037 | | Short tweets | 238 | | Tweets kept | 1944 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/36v1c13g/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dotcsv's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3g04fco4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3g04fco4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dotcsv') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
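As a complement to the "How to use" snippet above: the pipeline samples stochastically, so repeated runs return different tweets. Below is a minimal sketch of making generation reproducible and controlling sampling with only standard `transformers` arguments; the specific values (seed, `max_length`, `top_p`) are illustrative choices, not settings taken from this model's training or evaluation.

```python
from transformers import pipeline, set_seed

generator = pipeline('text-generation', model='huggingtweets/dotcsv')
set_seed(42)  # fix the RNG so repeated runs produce the same samples

# do_sample / top_p / max_length are ordinary generate() arguments;
# the values below are arbitrary starting points, not tuned settings.
outputs = generator(
    "My dream is",
    max_length=60,
    num_return_sequences=3,
    do_sample=True,
    top_p=0.95,
)
for out in outputs:
    print(out['generated_text'])
```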
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/dotcsv/1619159083139/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dotcsv
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Carlos Santana - DotCSV AI Bot @dotcsv bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dotcsv's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dotcsv's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363217665586835460/RU5F44Dj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">daniel 🤖 AI Bot </div> <div style="font-size: 15px">@downgrad3d bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@downgrad3d's tweets](https://twitter.com/downgrad3d). | Data | Quantity | | --- | --- | | Tweets downloaded | 441 | | Retweets | 138 | | Short tweets | 82 | | Tweets kept | 221 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/6eqzlox6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @downgrad3d's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1fsmvsit) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1fsmvsit/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/downgrad3d') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
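The training procedure described above is handled end to end by the huggingtweets toolchain, and the exact script and hyperparameters are recorded in the linked W&B run. Purely as an illustrative sketch of what causal-LM fine-tuning of GPT-2 on a tweet dump looks like with the generic `Trainer` API (the file name, hyperparameters, and use of `TextDataset` are assumptions for the example, not the project's actual code):

```python
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          DataCollatorForLanguageModeling, TextDataset,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

# tweets.txt is a hypothetical one-tweet-per-line dump of the kept tweets.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # causal LM, no masking

args = TrainingArguments(
    output_dir="gpt2-tweets",          # illustrative values, not the W&B-logged hyperparameters
    num_train_epochs=4,
    per_device_train_batch_size=8,
    save_strategy="no",
)

Trainer(model=model, args=args, data_collator=collator,
        train_dataset=train_dataset).train()
```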
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/downgrad3d/1614303163871/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/downgrad3d
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
daniel AI Bot @downgrad3d bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @downgrad3d's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @downgrad3d's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1435032258868482049/AySjv2ON_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Donovan</div> <div style="text-align: center; font-size: 14px;">@dp_crazy_gamer</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Donovan. | Data | Donovan | | --- | --- | | Tweets downloaded | 3214 | | Retweets | 763 | | Short tweets | 824 | | Tweets kept | 1627 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2pvd0ays/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dp_crazy_gamer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/14bwewth) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/14bwewth/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dp_crazy_gamer') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
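For readers who prefer working with the model and tokenizer directly rather than through `pipeline`, here is a minimal sketch using the standard `AutoModelForCausalLM` / `generate` API; the generation settings are arbitrary examples, not recommended values.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/dp_crazy_gamer")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/dp_crazy_gamer")

inputs = tokenizer("My dream is", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_length=60,                        # illustrative sampling settings
        do_sample=True,
        top_k=50,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```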
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/dp_crazy_gamer/1643299090939/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dp_crazy_gamer
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Donovan @dp\_crazy\_gamer I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Donovan. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dp\_crazy\_gamer's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1238524243996000257/JtmbZZL-_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">David Pakman 🤖 AI Bot </div> <div style="font-size: 15px">@dpakman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dpakman's tweets](https://twitter.com/dpakman). | Data | Quantity | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 49 | | Short tweets | 418 | | Tweets kept | 2783 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/el9fwqxw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dpakman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2esg5gfa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2esg5gfa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dpakman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dpakman
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
David Pakman AI Bot @dpakman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dpakman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dpakman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1223387148486836224/8HoUiYpU_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🏳️‍🌈Dragonogon🏳️‍⚧️🐲 🤖 AI Bot </div> <div style="font-size: 15px">@dragonogon bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@dragonogon's tweets](https://twitter.com/dragonogon). | Data | Quantity | | --- | --- | | Tweets downloaded | 3235 | | Retweets | 988 | | Short tweets | 346 | | Tweets kept | 1901 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/egunl2pl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dragonogon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1lgtnz96) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1lgtnz96/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dragonogon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dragonogon
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Dragonogon AI Bot @dragonogon bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @dragonogon's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dragonogon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/563843814725402624/Vb8k670S_400x400.png&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Drizzy</div> <div style="text-align: center; font-size: 14px;">@drake</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Drizzy. | Data | Drizzy | | --- | --- | | Tweets downloaded | 1766 | | Retweets | 396 | | Short tweets | 151 | | Tweets kept | 1219 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/178j75wb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drake's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gvmezqz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gvmezqz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/drake') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/drake/1631662344811/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/drake
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "has_space", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
AI BOT Drizzy @drake I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Drizzy. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @drake's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ 58 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us \n" ]
[ 0.01490766927599907, -0.029707439243793488, -0.005551214329898357, 0.019200731068849564, 0.13538123667240143, 0.031960584223270416, 0.07854003459215164, 0.15176640450954437, -0.03478027135133743, 0.020825274288654327, 0.17780765891075134, 0.13369141519069672, -0.025615064427256584, 0.10110418498516083, -0.04495277628302574, -0.27562591433525085, 0.05076512694358826, 0.04804648086428642, -0.03660755604505539, 0.13914404809474945, 0.08130049705505371, -0.04002818092703819, 0.09946437180042267, -0.02348330244421959, -0.18448196351528168, 0.03603345900774002, 0.03874115273356438, -0.10113959014415741, 0.1245182529091835, 0.042210761457681656, 0.097016341984272, 0.02144498936831951, -0.08173582702875137, -0.09949471801519394, 0.04076211526989937, 0.0524214468896389, -0.06747427582740784, 0.07371947914361954, 0.06548287719488144, -0.09104428440332413, 0.15018822252750397, 0.032346583902835846, -0.015613866969943047, 0.05642572045326233, -0.17452795803546906, -0.05158021301031113, -0.038718461990356445, 0.007089211139827967, 0.026932736858725548, 0.0765126645565033, -0.03151388466358185, 0.1801837533712387, -0.09969375282526016, 0.08080597966909409, 0.19449612498283386, -0.3136744201183319, -0.008555608801543713, 0.09032562375068665, 0.10929754376411438, 0.04874800145626068, -0.027296993881464005, 0.0915885642170906, 0.06193246319890022, 0.01960965059697628, 0.03833269700407982, -0.04861060902476311, -0.10319411009550095, 0.05647158622741699, -0.09057577699422836, -0.06054073944687843, 0.21635963022708893, -0.04008340463042259, 0.06982671469449997, -0.06175680458545685, -0.10815948247909546, -0.04076464846730232, -0.0019450898980721831, 0.007722846698015928, -0.04665110260248184, 0.07648039609193802, -0.018055662512779236, -0.07031384110450745, -0.16104750335216522, 0.004024024587124586, -0.19634990394115448, 0.15040342509746552, -0.007091521751135588, 0.03826533630490303, -0.18292304873466492, 0.11239926517009735, -0.00463539082556963, -0.09352986514568329, 0.05768691375851631, -0.09079932421445847, 0.06515228003263474, 0.012864203192293644, -0.07714717835187912, -0.01476553175598383, 0.07529455423355103, 0.14459288120269775, -0.03202451393008232, -0.014661530964076519, 0.005731707438826561, 0.08028950542211533, 0.0649947077035904, 0.0529506653547287, -0.05044480413198471, -0.04604479670524597, 0.023471293970942497, -0.12279047816991806, 0.005323461256921291, -0.08722265064716339, -0.11850544810295105, -0.06324991583824158, 0.02913680486381054, 0.0409526452422142, 0.05413410812616348, 0.11646701395511627, -0.03819722682237625, 0.003125704126432538, 0.0790749341249466, -0.04754907265305519, 0.014819944277405739, -0.019563961774110794, 0.019939353689551353, 0.10169259458780289, -0.004264532588422298, 0.02739645354449749, -0.08098349720239639, 0.053036682307720184, -0.10909318923950195, -0.01619906909763813, -0.01401914469897747, -0.08360559493303299, 0.03495157137513161, -0.13044656813144684, 0.003459199797362089, -0.1794617623090744, -0.08515556156635284, 0.004473025444895029, -0.02669714391231537, -0.029448989778757095, -0.07044726610183716, -0.000012087725735909771, -0.03782980516552925, 0.09037511795759201, -0.049331095069646835, 0.02187872678041458, -0.05999310687184334, 0.10105309635400772, -0.06390658020973206, 0.10026112198829651, -0.14347022771835327, 0.062101077288389206, -0.13453714549541473, -0.007185438182204962, -0.056844428181648254, 0.06125464290380478, 0.003428247757256031, 0.14510339498519897, -0.0019743351731449366, -0.024058373644948006, -0.11893454194068909, 
0.07660667598247528, -0.013276024721562862, 0.20594745874404907, -0.08867699652910233, -0.12363981455564499, 0.18977078795433044, -0.05639764666557312, -0.13151882588863373, 0.12418028712272644, 0.014429234899580479, 0.0632154569029808, 0.06213442608714104, 0.22679246962070465, 0.019381923601031303, -0.012931653298437595, 0.022296493873000145, 0.09503678232431412, -0.14555561542510986, -0.009661003015935421, 0.00841028243303299, -0.0034040703903883696, -0.08606491982936859, 0.03532338887453079, 0.11260384321212769, 0.09308808296918869, -0.06410733610391617, -0.02130906470119953, -0.04973118007183075, -0.002825164934620261, 0.08984078466892242, -0.004783526994287968, 0.10731708258390427, -0.1087048277258873, -0.0696249008178711, -0.030250849202275276, -0.0012943379115313292, 0.019239740446209908, 0.047808386385440826, -0.02344650961458683, 0.12562713027000427, -0.007734235376119614, 0.03894360736012459, -0.1425095796585083, -0.08136747777462006, -0.0352344736456871, 0.1538124978542328, 0.043824564665555954, 0.12064811587333679, 0.05824866518378258, -0.05751828849315643, -0.01182011142373085, -0.0009749550954438746, 0.1436033844947815, -0.01271585002541542, -0.0852496474981308, -0.05857346951961517, 0.08234002441167831, -0.07005385309457779, 0.025747623294591904, -0.044775284826755524, 0.034807976335287094, 0.07559926062822342, 0.12458331137895584, -0.012196341529488564, 0.03711283579468727, -0.0114806042984128, -0.00011358146002748981, -0.0896436795592308, -0.017758425325155258, 0.08426769077777863, -0.01031328085809946, -0.06891145557165146, 0.2548580765724182, -0.20196770131587982, 0.2299574315547943, 0.2296006977558136, -0.24926680326461792, -0.031021486967802048, -0.033769410103559494, -0.053823258727788925, 0.017695831134915352, 0.037970591336488724, -0.06758655607700348, 0.04836089164018631, -0.04798879474401474, 0.14743368327617645, -0.039952509105205536, -0.067154660820961, 0.011684614233672619, -0.06448059529066086, -0.05935867503285408, 0.06313008069992065, 0.042497556656599045, -0.12196247279644012, 0.19907300174236298, 0.256646990776062, 0.05731065943837166, 0.19537882506847382, 0.018505526706576347, -0.008647909387946129, 0.05532584711909294, -0.06024787575006485, -0.05939318984746933, -0.047756556421518326, -0.17080695927143097, -0.04192587733268738, 0.08264796435832977, 0.03818074241280556, 0.10142038762569427, -0.10767655074596405, -0.07215311378240585, 0.009001186117529869, 0.015298610553145409, -0.0071457610465586185, 0.13598360121250153, 0.0518062487244606, 0.13914446532726288, -0.005930832587182522, 0.025881925597786903, 0.06802397966384888, 0.02908273972570896, -0.08634071797132492, 0.1456945538520813, -0.1443340927362442, -0.3761582672595978, -0.1337488889694214, -0.09692523628473282, -0.01728254184126854, 0.04180695861577988, 0.12187701463699341, -0.12770286202430725, 0.0015871572541072965, -0.038519375026226044, 0.11093265563249588, -0.08619660884141922, 0.05064225569367409, -0.09312444925308228, 0.013121986761689186, -0.05371590703725815, -0.09208092838525772, -0.03968694806098938, -0.025510141626000404, -0.09566790610551834, 0.17000380158424377, -0.06683114916086197, 0.06858081370592117, 0.18506819009780884, 0.00719145592302084, 0.0289844311773777, -0.048802170902490616, 0.19252337515354156, -0.11087619513273239, 0.03966445475816727, 0.16040337085723877, 0.017642250284552574, 0.0870000347495079, 0.1063152626156807, -0.012242939323186874, -0.0729413852095604, 0.045604314655065536, 0.00838512647897005, -0.1192161962389946, -0.1628645360469818, -0.12800830602645874, 
-0.09412717819213867, 0.11832976341247559, 0.04896165430545807, 0.06800009310245514, 0.15884871780872345, 0.07476164400577545, -0.026399539783596992, -0.014989175833761692, -0.030628371983766556, 0.06698833405971527, 0.1810096949338913, -0.03410731256008148, 0.14575767517089844, -0.058808114379644394, -0.11332095414400101, 0.13757702708244324, 0.02930915355682373, 0.041703663766384125, 0.02182352915406227, -0.00909164547920227, 0.002967006294056773, 0.1275966912508011, 0.12905162572860718, 0.09085892140865326, -0.009210709482431412, -0.02600487507879734, -0.043374914675951004, -0.013961076736450195, -0.004321942571550608, 0.0544881634414196, 0.04976500943303108, -0.17056837677955627, -0.05252770707011223, -0.15036118030548096, 0.10635104030370712, 0.0658026859164238, 0.1097705140709877, -0.1943172961473465, -0.002813685918226838, 0.07878942042589188, -0.032336506992578506, -0.10688291490077972, 0.0711626410484314, 0.08084181696176529, -0.10172833502292633, 0.061861153692007065, -0.006532561499625444, 0.10406243056058884, 0.023742493242025375, 0.09878388792276382, -0.05866028368473053, -0.06148262321949005, -0.016518592834472656, 0.0939461812376976, -0.2926773428916931, 0.18179792165756226, -0.02907240390777588, -0.09844435751438141, -0.07134795933961868, -0.027840720489621162, 0.02770385891199112, 0.08060616999864578, 0.07026281952857971, 0.03338824585080147, 0.004331306088715792, -0.096750907599926, -0.025082198902964592, 0.01906721480190754, 0.12366755306720734, -0.05625581368803978, -0.01350511983036995, -0.041342347860336304, 0.02173074521124363, 0.00881099235266447, 0.06707418709993362, 0.021282022818922997, -0.16175130009651184, 0.07081238180398941, 0.03874411806464195, 0.03632965683937073, 0.02809896133840084, -0.02992626093327999, -0.1548270583152771, 0.1699383705854416, 0.02149343490600586, -0.06355534493923187, -0.12458667904138565, -0.04415702819824219, 0.06474760919809341, -0.04559854418039322, 0.039427343755960464, -0.0629853680729866, -0.001072446582838893, -0.07396374642848969, -0.19678819179534912, 0.1352648138999939, -0.06877507269382477, -0.07956159859895706, -0.03609750047326088, 0.17941854894161224, -0.07844837009906769, 0.018584884703159332, 0.007694936357438564, 0.047691553831100464, -0.14645716547966003, -0.11046357452869415, 0.06859224289655685, -0.04709484800696373, 0.036693572998046875, 0.01622764952480793, -0.02127574197947979, 0.005549534223973751, -0.012371892109513283, -0.0025305403396487236, 0.27376097440719604, 0.22965525090694427, -0.0802733525633812, 0.17710179090499878, 0.0630764365196228, -0.0507618673145771, -0.326980859041214, -0.09341348707675934, -0.14117297530174255, -0.009823985397815704, 0.023037875071167946, -0.13106577098369598, 0.05048951879143715, 0.029267780482769012, -0.014271927997469902, 0.11641538888216019, -0.20954635739326477, -0.09825599938631058, 0.09716890752315521, -0.06865759193897247, 0.4034286141395569, -0.1301247775554657, -0.07252515107393265, -0.03537401184439659, -0.14370368421077728, 0.1951674520969391, -0.06888531148433685, 0.09445398300886154, -0.02521556057035923, 0.13606955111026764, 0.05328543111681938, -0.02234726957976818, 0.11178411543369293, -0.01972258649766445, 0.0011309145484119654, -0.13865596055984497, -0.0604546032845974, 0.08001494407653809, -0.007717274595052004, 0.009300438687205315, -0.10272051393985748, 0.02493043802678585, -0.1687590479850769, -0.0011172585655003786, -0.11747178435325623, 0.09020119160413742, 0.021906770765781403, -0.06782494485378265, -0.050800591707229614, -0.0397600457072258, 
-0.009170935489237309, -0.01781470514833927, 0.14266571402549744, -0.05449723079800606, 0.1958875209093094, 0.07874593138694763, 0.06835102289915085, -0.1398908495903015, 0.05150968208909035, -0.027329890057444572, -0.0722222700715065, 0.07193705439567566, -0.1763947606086731, 0.04665200039744377, 0.08747349679470062, -0.03685600683093071, 0.04361190274357796, 0.09054525941610336, 0.0016616116045042872, 0.015259911306202412, 0.17542323470115662, -0.26249444484710693, -0.009265073575079441, -0.05851370841264725, -0.07754334062337875, 0.09399785101413727, 0.05022705718874931, 0.17302869260311127, 0.01883954182267189, -0.042071253061294556, 0.01570643112063408, 0.01509455218911171, -0.059037789702415466, 0.045513592660427094, 0.02684464305639267, 0.008570663630962372, -0.1221366673707962, 0.05156322568655014, 0.023951835930347443, -0.13511396944522858, 0.028608962893486023, 0.18224509060382843, -0.11025255918502808, -0.12385176122188568, -0.05918734520673752, 0.014604943804442883, -0.1147887259721756, 0.022507676854729652, -0.0024671610444784164, -0.09835006296634674, 0.062000639736652374, 0.13879112899303436, 0.06231173500418663, 0.12207856774330139, -0.016749046742916107, -0.009889025241136551, -0.02596008963882923, -0.03651546314358711, 0.030208537355065346, 0.03168511390686035, -0.0861181765794754, 0.11346250027418137, -0.02512839064002037, 0.1417820155620575, -0.10664749890565872, -0.05201685056090355, -0.14300444722175598, -0.032158151268959045, -0.08575893938541412, -0.11310356855392456, -0.07854323089122772, -0.07035478204488754, 0.009079275652766228, -0.06144321709871292, -0.04742341861128807, -0.07850945740938187, -0.1071249470114708, 0.0065527972765266895, -0.04037778079509735, 0.04593488946557045, -0.08211231976747513, -0.0025119397323578596, 0.11637649685144424, -0.022535819560289383, 0.16392095386981964, 0.11117493361234665, -0.09679865092039108, 0.07616551965475082, -0.12686336040496826, -0.10472826659679413, 0.09312791377305984, -0.011926734820008278, 0.03812364861369133, 0.11054535210132599, 0.005196572281420231, 0.026585882529616356, 0.05407439172267914, 0.06366553902626038, 0.03967318683862686, -0.10743927955627441, 0.0886315256357193, -0.020753242075443268, -0.1556011289358139, -0.045647621154785156, -0.0805148333311081, 0.03534896299242973, 0.01974404975771904, 0.10271792113780975, -0.034986238926649094, 0.0667109340429306, -0.08902619034051895, 0.024180470034480095, -0.000016180018064915203, -0.17815378308296204, -0.01795545034110546, -0.044853758066892624, 0.03495967015624046, 0.017773056402802467, 0.24157072603702545, 0.06085042282938957, -0.06760483235120773, 0.04657021537423134, 0.12547647953033447, 0.015325317159295082, 0.0012204478261992335, 0.15906299650669098, 0.09267544746398926, -0.07720284163951874, -0.1355191469192505, 0.06737678498029709, 0.019282789900898933, -0.045628830790519714, 0.09535399079322815, 0.009011821821331978, 0.021824736148118973, 0.0726994201540947, -0.01307250652462244, 0.0016560814110562205, -0.10145208984613419, -0.11841893196105957, -0.038609180599451065, 0.056700821965932846, -0.03823903948068619, 0.08270184695720673, 0.1624554693698883, -0.01529943011701107, 0.038163430988788605, -0.03936288133263588, -0.01488962396979332, -0.12554575502872467, -0.1426725685596466, -0.06450272351503372, -0.16302600502967834, -0.007763232104480267, -0.08333444595336914, 0.06433288753032684, 0.07949802279472351, 0.059826187789440155, -0.04269853234291077, 0.07651268690824509, 0.041022781282663345, -0.10034794360399246, 0.07510355859994888, 
-0.02059151791036129, 0.05323530361056328, -0.021385202184319496, -0.019656827673316002, -0.10213424265384674, 0.05417940020561218, -0.020723329856991768, 0.0416044183075428, -0.05111391469836235, 0.009476713836193085, -0.17171601951122284, -0.1145220696926117, -0.06269708275794983, 0.05981360003352165, -0.05057463422417641, 0.05945824086666107, 0.01168898493051529, 0.011022492311894894, 0.016294511035084724, 0.24577881395816803, -0.06521986424922943, -0.04237864539027214, -0.039796773344278336, 0.1401647925376892, -0.010960960760712624, 0.07910837978124619, -0.04971875622868538, -0.026874123141169548, -0.11801496148109436, 0.31808507442474365, 0.3330231308937073, -0.09461544454097748, 0.042824193835258484, -0.011730561032891273, 0.03255566582083702, 0.12393459677696228, 0.12263859808444977, 0.1047649085521698, 0.23149316012859344, -0.07330518960952759, -0.04662523791193962, -0.0235530287027359, -0.011487046256661415, -0.07061693072319031, 0.09144455194473267, 0.024042300879955292, -0.06604029983282089, -0.04112962633371353, 0.061292614787817, -0.1968756765127182, 0.09409318119287491, -0.08292321115732193, -0.2009436935186386, -0.043561115860939026, 0.03980926424264908, 0.11663878709077835, -0.0058410209603607655, 0.12327833473682404, 0.0021221975330263376, -0.06928802281618118, 0.0007765711052343249, 0.021609993651509285, -0.21508292853832245, 0.024919532239437103, 0.07233009487390518, -0.12181923538446426, -0.003256353549659252, -0.03614840656518936, -0.0016090048011392355, 0.08110456168651581, 0.041294027119874954, -0.03950197249650955, 0.026108399033546448, 0.004547862336039543, -0.03431681916117668, -0.023192770779132843, 0.04429425671696663, 0.04074658825993538, -0.15583069622516632, 0.0852603018283844, -0.15913386642932892, 0.045099467039108276, -0.033806633204221725, 0.0012128398520871997, -0.010232296772301197, -0.0017561162821948528, -0.040829166769981384, 0.08149882405996323, 0.07333235442638397, -0.022076698020100594, -0.0061705452390015125, -0.06384896486997604, -0.055786989629268646, -0.027778856456279755, -0.08865642547607422, -0.09847963601350784, -0.12417391687631607, -0.10101866722106934, 0.09827395528554916, -0.022276965901255608, -0.17602279782295227, 0.008451612666249275, -0.09465660154819489, 0.06178250536322594, -0.14477479457855225, 0.11634210497140884, 0.08996126055717468, 0.0033104352187365294, 0.006666821893304586, -0.021679235622286797, 0.07054676860570908, 0.1143367812037468, -0.09097250550985336, -0.06963858008384705 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418286808509583361/wr1RfH41_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">izzy 😼 (anti-ableism arc)</div> <div style="text-align: center; font-size: 14px;">@drbelbel0</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from izzy 😼 (anti-ableism arc). | Data | izzy 😼 (anti-ableism arc) | | --- | --- | | Tweets downloaded | 340 | | Retweets | 174 | | Short tweets | 57 | | Tweets kept | 109 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/y28lpi1f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drbelbel0's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/362qf1n5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/362qf1n5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/drbelbel0') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/drbelbel0/1627246944704/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/drbelbel0
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT izzy (anti-ableism arc) @drbelbel0 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from izzy (anti-ableism arc). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @drbelbel0's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/2684024563/9660a122cc7fa5a3d348e16614ebb7a7_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dr. Brian May 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@drbrianmay bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@drbrianmay's tweets](https://twitter.com/drbrianmay). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3232</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>448</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>60</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2724</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3ee80djp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drbrianmay's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/1zzbge0u) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/1zzbge0u/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/drbrianmay'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
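As an addendum to the usage snippet above: the pipeline can also be saved to disk and reloaded without re-downloading the weights. This is a sketch only; the local directory name below is an arbitrary example, not part of the original card.

<pre><code>from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/drbrianmay')
# './drbrianmay-local' is an arbitrary example path
generator.save_pretrained('./drbrianmay-local')  # writes model, config and tokenizer files
offline_generator = pipeline('text-generation', model='./drbrianmay-local')
offline_generator("My dream is", num_return_sequences=5)</code></pre>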
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/drbrianmay
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Dr. Brian May AI Bot </div> <div style="font-size: 15px; color: #657786">@drbrianmay bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @drbrianmay's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3232</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>448</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>60</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2724</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @drbrianmay's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/drbrianmay'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @drbrianmay's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3232</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>448</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>60</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2724</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @drbrianmay's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/drbrianmay'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @drbrianmay's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3232</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>448</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>60</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2724</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @drbrianmay's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/drbrianmay'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1414914440231800840/vRSW6t9i_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Andrew Maragni 🇺🇸</div> <div style="text-align: center; font-size: 14px;">@drew106</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Andrew Maragni 🇺🇸. | Data | Andrew Maragni 🇺🇸 | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 786 | | Short tweets | 176 | | Tweets kept | 2282 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/pfjcjeb0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drew106's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3e1rv18u) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3e1rv18u/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/drew106') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/drew106/1627055915329/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/drew106
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Andrew Maragni 🇺🇸 @drew106 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Andrew Maragni 🇺🇸. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @drew106's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
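The "How to use" section in the processed text above lost its code block during text processing. For reference, the full model card earlier in this record contains the following pipeline call, reproduced here unchanged (the model id `huggingtweets/drew106` and the prompt come straight from that card, nothing is assumed):

```python
from transformers import pipeline

# Text-generation pipeline backed by the GPT-2 model fine-tuned on @drew106's tweets
generator = pipeline('text-generation', model='huggingtweets/drew106')

# Generate five continuations of the prompt used in the card's widget example
generator("My dream is", num_return_sequences=5)
```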
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1373793141506117641/gvV-BWCF_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">drewcoffman.eth 𝕚𝕤 𝕠𝕟𝕝𝕚𝕟𝕖 🟢</div> <div style="text-align: center; font-size: 14px;">@drewcoffman</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from drewcoffman.eth 𝕚𝕤 𝕠𝕟𝕝𝕚𝕟𝕖 🟢. | Data | drewcoffman.eth 𝕚𝕤 𝕠𝕟𝕝𝕚𝕟𝕖 🟢 | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 43 | | Short tweets | 540 | | Tweets kept | 2667 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2kh4r1d8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @drewcoffman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ln9svwl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ln9svwl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/drewcoffman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/drewcoffman/1627699166305/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/drewcoffman
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT URL 𝕚𝕤 𝕠𝕟𝕝𝕚𝕟𝕖 🟢 @drewcoffman I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from URL 𝕚𝕤 𝕠𝕟𝕝𝕚𝕟𝕖 🟢. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @drewcoffman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
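As with the previous record, the code referenced by this card's "How to use" section appears only in the full text field above; the snippet below reproduces it verbatim for this model id:

```python
from transformers import pipeline

# GPT-2 fine-tuned on @drewcoffman's tweets, served through the text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/drewcoffman')
generator("My dream is", num_return_sequences=5)
```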
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/847818629840228354/VXyQHfn0_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1468502340634296326/gbl8-ltv_400x400.png&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1374924360780242944/-Q8NfgEr_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">wint & Jril & wintbot_neo</div> <div style="text-align: center; font-size: 14px;">@dril-drilbot_neo-jril_bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from wint & Jril & wintbot_neo. | Data | wint | Jril | wintbot_neo | | --- | --- | --- | --- | | Tweets downloaded | 3228 | 113 | 3241 | | Retweets | 475 | 0 | 315 | | Short tweets | 305 | 0 | 453 | | Tweets kept | 2448 | 113 | 2473 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/27nmrlyy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dril-drilbot_neo-jril_bot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/i64hq9wb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/i64hq9wb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/dril-drilbot_neo-jril_bot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/dril-drilbot_neo-jril_bot/1643968320729/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/dril-drilbot_neo-jril_bot
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG wint & Jril & wintbot\_neo @dril-drilbot\_neo-jril\_bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from wint & Jril & wintbot\_neo. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @dril-drilbot\_neo-jril\_bot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
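The record above describes a GPT-2-based huggingtweets model and its card shows usage via the `pipeline` helper. As a minimal illustrative sketch (not part of the original record), the same model id could also be loaded explicitly with `AutoTokenizer`/`AutoModelForCausalLM`; the generation settings below (`max_new_tokens`, sampling) are assumptions chosen for demonstration, not values taken from the card.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Model id taken from the record above; everything else is illustrative.
model_id = "huggingtweets/dril-drilbot_neo-jril_bot"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Same prompt the card's widget uses.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,        # assumed length cap for short tweet-like outputs
    do_sample=True,           # sampling is required for multiple distinct returns
    num_return_sequences=5,   # mirrors the card's num_return_sequences=5
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```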