Dataset schema (columns, types, and observed value ranges):

| Column | Type | Values |
| --- | --- | --- |
| sha | null | |
| last_modified | null | |
| library_name | stringclasses | 154 values |
| text | stringlengths | 1–900k |
| metadata | stringlengths | 2–348k |
| pipeline_tag | stringclasses | 45 values |
| id | stringlengths | 5–122 |
| tags | listlengths | 1–1.84k |
| created_at | stringlengths | 25–25 |
| arxiv | listlengths | 0–201 |
| languages | listlengths | 0–1.83k |
| tags_str | stringlengths | 17–9.34k |
| text_str | stringlengths | 0–389k |
| text_lists | listlengths | 0–722 |
| processed_texts | listlengths | 1–723 |
| tokens_length | listlengths | 1–723 |
| input_texts | listlengths | 1–61 |
| embeddings | listlengths | 768–768 |
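The columns above can be read back with the `datasets` library. The sketch below is only an illustration of how the fields in this schema would be accessed; the dataset identifier is a placeholder, not the real repository id.

```python
# Minimal sketch of loading and inspecting rows with the schema above.
# NOTE: "your-username/model-cards-with-embeddings" is a placeholder id.
from datasets import load_dataset

ds = load_dataset("your-username/model-cards-with-embeddings", split="train")

row = ds[0]
print(row["id"])               # e.g. "huggingtweets/ihavesexhourly"
print(row["pipeline_tag"])     # e.g. "text-generation"
print(row["created_at"])       # 25-character ISO timestamp
print(len(row["tags"]))        # number of tag strings for this model
print(len(row["embeddings"]))  # 768 floats per row
print(row["text"][:200])       # start of the raw model card markdown
```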
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1436764466868273159/z-bXRwzQ_400x400.png&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Scientist</div> <div style="text-align: center; font-size: 14px;">@ihavesexhourly</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Scientist. | Data | Scientist | | --- | --- | | Tweets downloaded | 3205 | | Retweets | 841 | | Short tweets | 621 | | Tweets kept | 1743 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2qyzrpd8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ihavesexhourly's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/m2o7mtpw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/m2o7mtpw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ihavesexhourly') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ihavesexhourly/1631841194880/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ihavesexhourly
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Scientist @ihavesexhourly I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Scientist. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ihavesexhourly's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 768-dimensional embedding vector (floating-point values) omitted ]
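Each row stores a 768-dimensional embedding of its processed text. The dataset does not say which encoder produced these vectors; the `passage:` prefix seen in `input_texts` matches the convention used by E5-style text embedding models, so the sketch below uses `intfloat/e5-base-v2` purely as an assumed stand-in for whatever 768-dimensional encoder was actually used.

```python
# Sketch only: reproduces the shape (768 floats per passage), not necessarily
# the vectors stored in the dataset, since the real encoder is unspecified.
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("intfloat/e5-base-v2")  # assumed stand-in model
passage = (
    "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation "
    "#huggingtweets #en #autotrain_compatible #endpoints_compatible "
    "#text-generation-inference #region-us \n"
)
embedding = encoder.encode(passage, normalize_embeddings=True)
print(embedding.shape)  # (768,)
```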
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1448859687449862147/frVD6mW3_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">juju 💰</div> <div style="text-align: center; font-size: 14px;">@ihyjuju</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from juju 💰. | Data | juju 💰 | | --- | --- | | Tweets downloaded | 3248 | | Retweets | 1 | | Short tweets | 478 | | Tweets kept | 2769 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3n82hqbg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ihyjuju's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1t6rclcz) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1t6rclcz/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ihyjuju') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ihyjuju/1640741515385/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ihyjuju
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT juju @ihyjuju I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from juju . Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ihyjuju's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 768-dimensional embedding vector (floating-point values) omitted ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1179205017020063744/WnOlftVe_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">¯\_(ツ)_/¯</div> <div style="text-align: center; font-size: 14px;">@ijustbluemyself</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from ¯\_(ツ)_/¯. | Data | ¯\_(ツ)_/¯ | | --- | --- | | Tweets downloaded | 3224 | | Retweets | 250 | | Short tweets | 982 | | Tweets kept | 1992 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qgmk16ox/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ijustbluemyself's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2yq2ve7k) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2yq2ve7k/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ijustbluemyself') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ijustbluemyself/1625279746808/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ijustbluemyself
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT ¯\\_(ツ)\_/¯ @ijustbluemyself I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from ¯\_(ツ)\_/¯. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ijustbluemyself's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 768-dimensional embedding vector (floating-point values, truncated in the source) omitted ]
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/2429657879/vq8ux7qvn4ljg9oh7zzu_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Iván Díaz 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@ildiazm bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ildiazm's tweets](https://twitter.com/ildiazm). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>574</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>72</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>12</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>490</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3cn99ecb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ildiazm's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/167ssmah) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/167ssmah/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ildiazm'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ildiazm
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Iván Díaz AI Bot </div> <div style="font-size: 15px; color: #657786">@ildiazm bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @ildiazm's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>574</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>72</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>12</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>490</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @ildiazm's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ildiazm'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ildiazm's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>574</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>72</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>12</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>490</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ildiazm's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ildiazm'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ildiazm's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>574</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>72</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>12</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>490</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ildiazm's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ildiazm'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 427, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365435709981532165/v6Dv3nvt_400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">Banon 🤖 AI Bot </div>
<div style="font-size: 15px">@ilike_birds bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@ilike_birds's tweets](https://twitter.com/ilike_birds).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 1017 |
| Retweets | 39 |
| Short tweets | 337 |
| Tweets kept | 641 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/21wt3y4x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ilike_birds's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2g2q8s1w) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2g2q8s1w/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/ilike_birds')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ilike_birds/1617813434047/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ilike_birds
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Banon AI Bot @ilike\_birds bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ilike\_birds's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ilike\_birds's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
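Note: the processed text above states "You can use this model directly with a pipeline for text generation:" but the accompanying code block was stripped during preprocessing. For reference, this is the usage example carried in the unprocessed `text` field of this same record (model ID `huggingtweets/ilike_birds`); nothing beyond that field is assumed:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint for this record
generator = pipeline('text-generation', model='huggingtweets/ilike_birds')

# num_return_sequences controls how many completions are sampled for the prompt
generator("My dream is", num_return_sequences=5)
```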
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div>
<div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1370460588334268419/h0Y0-Ny__400x400.jpg')">
</div>
<div style="margin-top: 8px; font-size: 19px; font-weight: 800">IllinoisJones 🤖 AI Bot </div>
<div style="font-size: 15px">@iljone bot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on [@iljone's tweets](https://twitter.com/iljone).

| Data | Quantity |
| --- | --- |
| Tweets downloaded | 337 |
| Retweets | 6 |
| Short tweets | 99 |
| Tweets kept | 232 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3l85ym1p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iljone's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3uzsj96o) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3uzsj96o/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/iljone')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/iljone/1616774453050/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/iljone
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
IllinoisJones AI Bot @iljone bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. (pipeline diagram) To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @iljone's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iljone's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* Follow: URL. For more details, visit the project repository. GitHub stars: URL
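The fenced snippet from the card's "How to use" section was dropped when the markdown was flattened into this text field; a minimal sketch of it, taken from the raw card above, is:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @iljone's tweets from the Hugging Face Hub
generator = pipeline('text-generation', model='huggingtweets/iljone')

# Sample five completions for the same prompt shown on the card
generator("My dream is", num_return_sequences=5)
```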
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1247219435338756099/wUX8KxD4_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Josh Cherry 🌱 🤖 AI Bot </div> <div style="font-size: 15px">@ilovelucilius bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ilovelucilius's tweets](https://twitter.com/ilovelucilius). | Data | Quantity | | --- | --- | | Tweets downloaded | 331 | | Retweets | 42 | | Short tweets | 9 | | Tweets kept | 280 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ztd1uk0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ilovelucilius's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gbbrvx4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gbbrvx4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ilovelucilius') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ilovelucilius/1616644679483/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ilovelucilius
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Josh Cherry AI Bot @ilovelucilius bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. (pipeline diagram) To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ilovelucilius's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ilovelucilius's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* Follow: URL. For more details, visit the project repository. GitHub stars: URL
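As above, the code block under "How to use" did not survive the flattening; the sketch below restates the card's pipeline call and adds an illustrative loop over the returned completions (the loop and print are not part of the original card):

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @ilovelucilius's tweets
generator = pipeline('text-generation', model='huggingtweets/ilovelucilius')

# The text-generation pipeline returns a list of dicts keyed by 'generated_text'
for output in generator("My dream is", num_return_sequences=5):
    print(output['generated_text'])
```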
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1425684733753626624/q521TgTG_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Ilya Sutskever</div> <div style="text-align: center; font-size: 14px;">@ilyasut</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Ilya Sutskever. | Data | Ilya Sutskever | | --- | --- | | Tweets downloaded | 852 | | Retweets | 474 | | Short tweets | 39 | | Tweets kept | 339 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/y41t187f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ilyasut's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2slwglzj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2slwglzj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ilyasut') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ilyasut/1653408370188/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ilyasut
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Ilya Sutskever @ilyasut I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. (pipeline diagram) To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Ilya Sutskever. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ilyasut's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* Follow: URL. For more details, visit the project repository. GitHub stars: URL
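The card's usage snippet is again missing from the flattened text; the sketch below mirrors it and adds a seed for repeatable sampling (set_seed is standard transformers usage, not something the card specifies):

```python
from transformers import pipeline, set_seed

# Fix the RNG so repeated runs sample the same tweets (illustrative addition)
set_seed(42)

# Load the GPT-2 checkpoint fine-tuned on @ilyasut's tweets
generator = pipeline('text-generation', model='huggingtweets/ilyasut')
generator("My dream is", num_return_sequences=5)
```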
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356115782606852103/lawv78Xb_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Brown Timothée Chalamet 🤖 AI Bot </div> <div style="font-size: 15px">@imaginary_bi bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imaginary_bi's tweets](https://twitter.com/imaginary_bi). | Data | Quantity | | --- | --- | | Tweets downloaded | 1204 | | Retweets | 189 | | Short tweets | 72 | | Tweets kept | 943 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3srr04nu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imaginary_bi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3773072h) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3773072h/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imaginary_bi') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imaginary_bi/1614117239005/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imaginary_bi
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Brown Timothée Chalamet AI Bot @imaginary\_bi bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @imaginary\_bi's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imaginary\_bi's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
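The plain-text card above states that the model "can be used directly with a pipeline for text generation" but the processed text drops the snippet itself. Restored from the full card for this record (model id `huggingtweets/imaginary_bi`):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint for this record.
generator = pipeline('text-generation', model='huggingtweets/imaginary_bi')

# Generate five continuations of the widget prompt, as in the original card.
generator("My dream is", num_return_sequences=5)
```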
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1354295229654958081/FUhOGuYV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ace 🤖 AI Bot </div> <div style="font-size: 15px">@imcummingonline bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imcummingonline's tweets](https://twitter.com/imcummingonline). | Data | Quantity | | --- | --- | | Tweets downloaded | 914 | | Retweets | 88 | | Short tweets | 218 | | Tweets kept | 608 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2yh36yxx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imcummingonline's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3nnnr0u8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3nnnr0u8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imcummingonline') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imcummingonline/1617770513198/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imcummingonline
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ace AI Bot @imcummingonline bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @imcummingonline's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imcummingonline's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
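As with the previous record, the processed text omits the generation snippet referenced just above. A minimal sketch for this record (model id `huggingtweets/imcummingonline`); `set_seed` is added here only to make the sampled outputs reproducible and is not part of the original card:

```python
from transformers import pipeline, set_seed

# Fix the sampling seed so repeated runs give the same generations (not in the original card).
set_seed(42)

# Load the fine-tuned GPT-2 checkpoint for this record.
generator = pipeline('text-generation', model='huggingtweets/imcummingonline')

# Generate five continuations of the widget prompt.
generator("My dream is", num_return_sequences=5)
```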
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1397711387380617219/Hzreffrt_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Contra</div> <div style="text-align: center; font-size: 14px;">@imgrimevil</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Contra. | Data | Contra | | --- | --- | | Tweets downloaded | 3238 | | Retweets | 669 | | Short tweets | 582 | | Tweets kept | 1987 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1kn7qqp8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imgrimevil's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/fjaoumhd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/fjaoumhd/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imgrimevil') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imgrimevil/1627251988335/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imgrimevil
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Contra @imgrimevil I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Contra. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imgrimevil's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
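The processed text again omits the generation snippet. Besides the `pipeline` one-liner shown in the full card, the same checkpoint can be loaded with the lower-level `transformers` classes; a minimal sketch, with sampling parameters that are illustrative rather than taken from the card:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load tokenizer and fine-tuned GPT-2 weights for this record.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/imgrimevil")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/imgrimevil")

# Encode the widget prompt and sample five continuations.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,               # sampling is required for multiple distinct sequences
    top_p=0.95,                   # illustrative nucleus-sampling value, not from the card
    max_new_tokens=40,            # illustrative length cap
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```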
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1289653820071522304/cdikNvkG_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jack Rudd 🇹🇹 🏳️‍⚧️</div> <div style="text-align: center; font-size: 14px;">@imjackrudd</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jack Rudd 🇹🇹 🏳️‍⚧️. | Data | Jack Rudd 🇹🇹 🏳️‍⚧️ | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 55 | | Short tweets | 327 | | Tweets kept | 2864 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3g5589wt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imjackrudd's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/eyywpszu) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/eyywpszu/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imjackrudd') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imjackrudd/1632871893609/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imjackrudd
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jack Rudd 🇹🇹 🏳️‍⚧️ @imjackrudd I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jack Rudd 🇹🇹 🏳️‍⚧️. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imjackrudd's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
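The code snippet that follows "You can use this model directly with a pipeline for text generation" was dropped from this plain-text rendering; a minimal sketch mirroring the snippet in the full card for this record:

```python
from transformers import pipeline

# Text generation with the fine-tuned @imjackrudd model
generator = pipeline('text-generation', model='huggingtweets/imjackrudd')
generator("My dream is", num_return_sequences=5)
```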
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1363904099922677762/CZEhI56N_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Luca 🤖 AI Bot </div> <div style="font-size: 15px">@imjustluca bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imjustluca's tweets](https://twitter.com/imjustluca). | Data | Quantity | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 379 | | Short tweets | 261 | | Tweets kept | 2578 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ap66ek7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imjustluca's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1qfi3jgq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1qfi3jgq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imjustluca') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imjustluca/1614160603911/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imjustluca
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Luca AI Bot @imjustluca bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @imjustluca's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imjustluca's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
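As with the previous record, the "How to use" snippet was dropped from this plain-text rendering; a minimal sketch mirroring the full card's example:

```python
from transformers import pipeline

# Text generation with the fine-tuned @imjustluca model
generator = pipeline('text-generation', model='huggingtweets/imjustluca')
generator("My dream is", num_return_sequences=5)
```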
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1298909619721388035/1v9WJxu7_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jaelynn 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@imjustuhgrl bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imjustuhgrl's tweets](https://twitter.com/imjustuhgrl). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3236</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>15</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>512</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2709</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/4phdk9xl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imjustuhgrl's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/22432rm3) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/22432rm3/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/imjustuhgrl'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imjustuhgrl/1601318938681/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imjustuhgrl
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jaelynn AI Bot </div> <div style="font-size: 15px; color: #657786">@imjustuhgrl bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @imjustuhgrl's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3236</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>15</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>512</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2709</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @imjustuhgrl's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/imjustuhgrl'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @imjustuhgrl's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3236</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>15</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>512</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2709</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @imjustuhgrl's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/imjustuhgrl'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @imjustuhgrl's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3236</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>15</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>512</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2709</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @imjustuhgrl's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/imjustuhgrl'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1371650533111529472/0wqXcosZ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">gabagol cawfee 🤖 AI Bot </div> <div style="font-size: 15px">@immarxistonline bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@immarxistonline's tweets](https://twitter.com/immarxistonline). | Data | Quantity | | --- | --- | | Tweets downloaded | 3226 | | Retweets | 340 | | Short tweets | 732 | | Tweets kept | 2154 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3f3uoi57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @immarxistonline's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1tynoxd5) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1tynoxd5/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/immarxistonline') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
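The pipeline helper shown in the card is the simplest entry point. If finer control over decoding is needed, the same checkpoint can presumably be loaded directly with the standard `AutoTokenizer` / `AutoModelForCausalLM` classes; the sketch below assumes that route, and every sampling parameter in it is a placeholder rather than a value used by the project.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = 'huggingtweets/immarxistonline'
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode the same prompt used in the card's example.
inputs = tokenizer("My dream is", return_tensors="pt")

# Sampling settings below are illustrative assumptions.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        do_sample=True,
        top_p=0.95,
        max_length=50,
        num_return_sequences=5,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
    )

for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```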
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/immarxistonline/1617769482090/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/immarxistonline
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
gabagol cawfee AI Bot @immarxistonline bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @immarxistonline's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @immarxistonline's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1276507373260214275/RZ9iZEmJ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">The Immersive Kind 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@immersivekind bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@immersivekind's tweets](https://twitter.com/immersivekind). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>435</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>171</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>4</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>260</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bh9dpmh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @immersivekind's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ovh81f8f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ovh81f8f/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/immersivekind'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
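The training-procedure section above says the model is a pre-trained GPT-2 fine-tuned on the downloaded tweets, with the real hyperparameters recorded in the linked W&B run. As a rough illustration only, and not the huggingtweets training script, a generic causal-LM fine-tune of GPT-2 on a one-tweet-per-line text file might look like the sketch below; the file name and every hyperparameter are placeholders.

```python
# Hypothetical sketch, not the project's actual training code.
from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token          # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Assumed input: one preprocessed tweet per line in tweets.txt.
dataset = load_dataset("text", data_files={"train": "tweets.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetuned-gpt2",
    num_train_epochs=4,               # placeholder
    per_device_train_batch_size=8,    # placeholder
    learning_rate=5e-5,               # placeholder
)

Trainer(model=model, args=args, train_dataset=tokenized["train"],
        data_collator=collator).train()
```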
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/immersivekind
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">The Immersive Kind AI Bot </div> <div style="font-size: 15px; color: #657786">@immersivekind bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @immersivekind's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>435</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>171</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>4</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>260</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @immersivekind's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/immersivekind'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @immersivekind's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>435</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>171</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>4</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>260</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @immersivekind's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/immersivekind'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @immersivekind's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>435</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>171</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>4</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>260</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @immersivekind's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/immersivekind'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 428, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1297793420094607360/3hhcM4L2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🇨🇦📎 🤖 AI Bot </div> <div style="font-size: 15px">@imnotseto bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imnotseto's tweets](https://twitter.com/imnotseto). | Data | Quantity | | --- | --- | | Tweets downloaded | 342 | | Retweets | 15 | | Short tweets | 50 | | Tweets kept | 277 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/33rcvwm6/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imnotseto's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/35wya1gp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/35wya1gp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imnotseto') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
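The pipeline call in the card above is the simplest entry point. If more control over decoding is needed, the same checkpoint can also be loaded through the generic auto classes; the sketch below is only an illustration (the sampling parameters and length cap are assumed values, not settings used by huggingtweets):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned checkpoint from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/imnotseto")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/imnotseto")

# Encode a prompt and sample several continuations
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                        # sample instead of greedy decoding
    top_p=0.95,                            # illustrative nucleus-sampling value
    max_length=60,                         # illustrative length cap
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 ships without a pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```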
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imnotseto/1614213422097/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imnotseto
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
🇨🇦 AI Bot @imnotseto bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @imnotseto's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imnotseto's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
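The "How to use" section above introduces a text-generation pipeline call; the corresponding snippet from the full card for @imnotseto is:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/imnotseto')
generator("My dream is", num_return_sequences=5)
```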
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1335360624646295552/kaAOgc0s_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">imo !!! 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@imogenloisfox bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imogenloisfox's tweets](https://twitter.com/imogenloisfox). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2473</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>883</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>219</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1371</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2dm16o1m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imogenloisfox's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ectjmyn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ectjmyn/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/imogenloisfox'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
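The training procedure above is described only at a high level: a pre-trained GPT-2 fine-tuned on the kept tweets, with hyperparameters and metrics logged to W&B. A minimal sketch of that kind of fine-tuning loop is shown below; the placeholder corpus, hyperparameter values, output path, and `report_to` setting are assumptions for illustration, not the actual huggingtweets training code:

```python
from transformers import (AutoTokenizer, AutoModelForCausalLM, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

# Start from the pre-trained GPT-2 checkpoint
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token        # GPT-2 has no pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Placeholder corpus standing in for the "tweets kept" after filtering
tweets = ["just watched the sunrise over the lake", "new blog post is up!"]
train_dataset = [tokenizer(t, truncation=True, max_length=128) for t in tweets]

# Causal-LM collator pads each batch and copies input_ids to labels
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="finetuned-tweets",   # hypothetical output path
    num_train_epochs=4,              # illustrative hyperparameters
    per_device_train_batch_size=8,
    report_to="wandb",               # log metrics to Weights & Biases
)

trainer = Trainer(model=model, args=args,
                  train_dataset=train_dataset, data_collator=collator)
trainer.train()
```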
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imogenloisfox/1608309297782/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imogenloisfox
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">imo !!! AI Bot </div> <div style="font-size: 15px; color: #657786">@imogenloisfox bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @imogenloisfox's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>2473</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>883</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>219</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1371</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @imogenloisfox's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/imogenloisfox'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @imogenloisfox's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2473</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>883</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>219</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1371</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @imogenloisfox's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/imogenloisfox'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @imogenloisfox's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>2473</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>883</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>219</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1371</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @imogenloisfox's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/imogenloisfox'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 433, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1319291252718759938/q2NdOiAb_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Robert Yi 🐳</div> <div style="text-align: center; font-size: 14px;">@imrobertyi</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Robert Yi 🐳. | Data | Robert Yi 🐳 | | --- | --- | | Tweets downloaded | 1353 | | Retweets | 61 | | Short tweets | 130 | | Tweets kept | 1162 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3cmckdcz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imrobertyi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/fi24mvdb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/fi24mvdb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imrobertyi') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imrobertyi/1631652694998/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imrobertyi
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Robert Yi @imrobertyi I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Robert Yi. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imrobertyi's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1327751099336552449/R5srCw96_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">carson 🤖 AI Bot </div> <div style="font-size: 15px">@imscribbledude bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@imscribbledude's tweets](https://twitter.com/imscribbledude). | Data | Quantity | | --- | --- | | Tweets downloaded | 2286 | | Retweets | 458 | | Short tweets | 252 | | Tweets kept | 1576 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2eyhb2dr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @imscribbledude's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1t0me7sm) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1t0me7sm/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/imscribbledude') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/imscribbledude/1614102197502/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/imscribbledude
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
carson AI Bot @imscribbledude bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @imscribbledude's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @imscribbledude's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/706481670090690563/LXli4ovR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Matthew Incantalupo 🤖 AI Bot </div> <div style="font-size: 15px">@incantalupo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@incantalupo's tweets](https://twitter.com/incantalupo). | Data | Quantity | | --- | --- | | Tweets downloaded | 1738 | | Retweets | 36 | | Short tweets | 61 | | Tweets kept | 1641 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/12pm0jbi/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @incantalupo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vnxuapw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vnxuapw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/incantalupo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/incantalupo/1616711390839/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/incantalupo
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Matthew Incantalupo AI Bot @incantalupo bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @incantalupo's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @incantalupo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/581592941124153346/5nfUJyU2_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/561419401145376768/7OIwxUCC_400x400.jpeg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1190256978007904257/TsXH7_nP_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Charmeuse & Sad Socrates & Vincent Van Gone</div> <div style="text-align: center; font-size: 14px;">@incharmuese-sadsocrates-vvangone</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Charmeuse & Sad Socrates & Vincent Van Gone. | Data | Charmeuse | Sad Socrates | Vincent Van Gone | | --- | --- | --- | --- | | Tweets downloaded | 3238 | 3197 | 3233 | | Retweets | 1165 | 40 | 1054 | | Short tweets | 248 | 161 | 266 | | Tweets kept | 1825 | 2996 | 1913 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/13ochftk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @incharmuese-sadsocrates-vvangone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/173sb7ob) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/173sb7ob/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/incharmuese-sadsocrates-vvangone') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/incharmuese-sadsocrates-vvangone/1635521727120/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/incharmuese-sadsocrates-vvangone
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Charmeuse & Sad Socrates & Vincent Van Gone @incharmuese-sadsocrates-vvangone I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Charmeuse & Sad Socrates & Vincent Van Gone. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @incharmuese-sadsocrates-vvangone's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
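As above, the processed card mentions pipeline-based generation but the code block was stripped. A minimal sketch for this record's checkpoint (`huggingtweets/incharmuese-sadsocrates-vvangone`, per the full card); the `set_seed` call is an optional addition for reproducible sampling and is not part of the original card:

```python
from transformers import pipeline, set_seed

# Optional: fix the sampling seed so repeated runs give the same outputs.
set_seed(42)

# Load the GPT-2 checkpoint fine-tuned on the three accounts in this record.
generator = pipeline('text-generation',
                     model='huggingtweets/incharmuese-sadsocrates-vvangone')

# Generate five continuations of the same prompt.
generator("My dream is", num_return_sequences=5)
```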
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1357846260934352899/EWTPeA8__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">indi 🍔 🤖 AI Bot </div> <div style="font-size: 15px">@indiburger bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@indiburger's tweets](https://twitter.com/indiburger). | Data | Quantity | | --- | --- | | Tweets downloaded | 3104 | | Retweets | 712 | | Short tweets | 372 | | Tweets kept | 2020 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3emok4ku/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @indiburger's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/rpeuqv5y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/rpeuqv5y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/indiburger') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/indiburger/1614096163881/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/indiburger
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
indi AI Bot @indiburger bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @indiburger's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @indiburger's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
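Again, the processed card references the text-generation pipeline without the stripped snippet. A minimal sketch for `huggingtweets/indiburger` (model id from the full card); the `max_length` value is an illustrative assumption, not taken from the original card:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @indiburger's tweets.
generator = pipeline('text-generation', model='huggingtweets/indiburger')

# Cap each continuation at 50 tokens (illustrative choice) and return five samples.
generator("My dream is", num_return_sequences=5, max_length=50)
```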
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1298927466074046464/rYBDt889_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Алексей Целищев 🤖 AI Bot </div> <div style="font-size: 15px">@infernocav bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@infernocav's tweets](https://twitter.com/infernocav). | Data | Quantity | | --- | --- | | Tweets downloaded | 129 | | Retweets | 8 | | Short tweets | 16 | | Tweets kept | 105 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3fbjwvhg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @infernocav's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/cxwbz9yp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/cxwbz9yp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/infernocav') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
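The snippet in the card above keeps every generation setting at its default. A slightly fuller sketch follows; `set_seed`, the sampling flags, and the token budget are illustrative additions for reproducibility and are not prescribed by the card:

```python
from transformers import pipeline, set_seed

# Fix the RNG so repeated runs give the same completions (illustrative choice).
set_seed(42)

generator = pipeline('text-generation', model='huggingtweets/infernocav')

# The sampling knobs below are example values, not settings documented by the card.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.95,
)
for out in outputs:
    print(out["generated_text"])
```

Dropping `set_seed` (or changing its value) yields different completions on every run, which is usually what you want when just browsing a bot's output.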
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/infernocav/1616656950369/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/infernocav
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Алексей Целищев AI Bot @infernocav bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @infernocav's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @infernocav's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346117150621257728/FeSBhcrN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">RIP TO THE VILLIAN 🤖 AI Bot </div> <div style="font-size: 15px">@infinitedodge bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@infinitedodge's tweets](https://twitter.com/infinitedodge). | Data | Quantity | | --- | --- | | Tweets downloaded | 2774 | | Retweets | 1524 | | Short tweets | 123 | | Tweets kept | 1127 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/qerz9onf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @infinitedodge's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2gw3u22x) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2gw3u22x/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/infinitedodge') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
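When the `pipeline` helper is too coarse, the same checkpoint can be loaded directly. This is only a sketch: it assumes the repo ships standard GPT-2 weights (which is what the card describes) and that PyTorch is installed; the length cap is an arbitrary example.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the checkpoint directly instead of through pipeline(); assumes the repo
# hosts standard GPT-2 weights, as the card states.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/infinitedodge")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/infinitedodge")

inputs = tokenizer("My dream is", return_tensors="pt")
with torch.no_grad():
    generated = model.generate(
        **inputs,
        do_sample=True,
        max_new_tokens=40,                    # example length cap, not from the card
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
    )
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

Calling `generate` directly exposes the full set of decoding options that the pipeline would otherwise forward on your behalf.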
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/infinitedodge/1614135156383/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/infinitedodge
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
RIP TO THE VILLIAN AI Bot @infinitedodge bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @infinitedodge's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @infinitedodge's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/542050723521773568/XfMb_pUx_400x400.jpeg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">infosec_dominatrix 🤖 AI Bot </div> <div style="font-size: 15px">@infosec_domme bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@infosec_domme's tweets](https://twitter.com/infosec_domme). | Data | Quantity | | --- | --- | | Tweets downloaded | 542 | | Retweets | 64 | | Short tweets | 57 | | Tweets kept | 421 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1s8mwvc2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @infosec_domme's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3qb5k1m0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3qb5k1m0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/infosec_domme') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
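Since the card leaves decoding entirely to defaults, one quick way to explore the model's range is to sweep the sampling temperature. The values below are arbitrary examples, not recommendations from the card:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/infosec_domme')

# Temperatures are illustrative; the card does not recommend decoding settings.
prompt = "My dream is"
for temperature in (0.7, 1.0, 1.3):
    text = generator(
        prompt,
        do_sample=True,
        temperature=temperature,
        max_new_tokens=30,
        num_return_sequences=1,
    )[0]["generated_text"]
    print(f"temperature={temperature}: {text}")
```

Lower temperatures stay closer to the account's most common phrasings; higher ones drift further from the training tweets.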
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/infosec_domme/1616349133246/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/infosec_domme
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
infosec\_dominatrix AI Bot @infosec\_domme bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @infosec\_domme's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @infosec\_domme's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1122970155204730882/KMOOjnGR_400x400.png&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Ingrida Šimonytė</div> <div style="text-align: center; font-size: 14px;">@ingridasimonyte</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Ingrida Šimonytė. | Data | Ingrida Šimonytė | | --- | --- | | Tweets downloaded | 283 | | Retweets | 17 | | Short tweets | 10 | | Tweets kept | 256 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1vod103u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ingridasimonyte's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2xm136ry) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2xm136ry/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ingridasimonyte') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ingridasimonyte/1620506733305/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ingridasimonyte
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Ingrida Šimonytė @ingridasimonyte I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Ingrida Šimonytė. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ingridasimonyte's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
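The processed text in this record drops the fenced code block from the card's "How to use" section. A minimal sketch of that usage, mirroring the snippet in the full card above; the `set_seed` call, `do_sample=True`, and `max_length` values are illustrative assumptions, not part of the source:

```python
from transformers import pipeline, set_seed

# Load the GPT-2 checkpoint fine-tuned on @ingridasimonyte's tweets from the Hugging Face Hub.
generator = pipeline('text-generation', model='huggingtweets/ingridasimonyte')

# Fix the random seed so the sampled tweets are reproducible (assumption, not from the card).
set_seed(42)

# Generate five sampled continuations of the prompt used in the original card.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    do_sample=True,   # explicit sampling so multiple return sequences are allowed (assumption)
    max_length=50,    # cap on total tokens per sequence (assumption)
)
for out in outputs:
    print(out["generated_text"])
```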
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346164467130884098/8w4pqpbj_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">borghisattva 🤖 AI Bot </div> <div style="font-size: 15px">@ingroupist bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ingroupist's tweets](https://twitter.com/ingroupist). | Data | Quantity | | --- | --- | | Tweets downloaded | 154 | | Retweets | 0 | | Short tweets | 0 | | Tweets kept | 154 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/fl5icybp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ingroupist's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/218gj8om) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/218gj8om/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ingroupist') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ingroupist/1616685344882/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ingroupist
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
borghisattva AI Bot @ingroupist bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ingroupist's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ingroupist's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
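As with the previous record, the stripped "How to use" step here reduces to the same text-generation pipeline call; a short sketch under stated assumptions, where the extra prompts and `max_new_tokens` are illustrative and not taken from the card:

```python
from transformers import pipeline

# Load the GPT-2 checkpoint fine-tuned on @ingroupist's tweets.
generator = pipeline('text-generation', model='huggingtweets/ingroupist')

# The card's widget prompt plus a couple of extra prompts (assumed for illustration).
prompts = ["My dream is", "Today I learned", "The best part of"]

for prompt in prompts:
    # Sample one continuation per prompt; max_new_tokens bounds the generated length.
    result = generator(prompt, do_sample=True, max_new_tokens=40, num_return_sequences=1)
    print(result[0]["generated_text"])
```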
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1403059590481289216/dqLJI_-U_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">INHALING MY SHEET OF SUN</div> <div style="text-align: center; font-size: 14px;">@inhalingmy</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from INHALING MY SHEET OF SUN. | Data | INHALING MY SHEET OF SUN | | --- | --- | | Tweets downloaded | 2647 | | Retweets | 0 | | Short tweets | 838 | | Tweets kept | 1809 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/r1ksmwi2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @inhalingmy's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2e1lrid4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2e1lrid4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/inhalingmy') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/inhalingmy/1631843035059/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/inhalingmy
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT INHALING MY SHEET OF SUN @inhalingmy I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from INHALING MY SHEET OF SUN. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @inhalingmy's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
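The same "How to use" text appears here without its code block. The full card for this record shows the pipeline call; the sketch below is an equivalent lower-level alternative, not the card's own snippet, and the `top_p` and `max_new_tokens` settings are assumptions used for illustration:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load tokenizer and model weights for the fine-tuned checkpoint.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/inhalingmy")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/inhalingmy")

# Encode the prompt from the card's widget and sample a continuation.
inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    do_sample=True,                        # sampling, matching the card's generator usage (assumption)
    top_p=0.95,                            # nucleus sampling threshold (illustrative assumption)
    max_new_tokens=40,                     # bound on newly generated tokens (illustrative assumption)
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no pad token; reuse EOS to silence warnings
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```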
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1062301628475289600/sCq-edVm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Su Başak 🤖 AI Bot </div> <div style="font-size: 15px">@inmidonot bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@inmidonot's tweets](https://twitter.com/inmidonot). | Data | Quantity | | --- | --- | | Tweets downloaded | 344 | | Retweets | 6 | | Short tweets | 15 | | Tweets kept | 323 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2mghgkpx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @inmidonot's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/12l0mm4t) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/12l0mm4t/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/inmidonot') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
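The training procedure described in the card above fine-tunes a pre-trained GPT-2 on the user's tweets; the project's actual training script lives in the linked repository and W&B run. The sketch below only illustrates what such a causal-LM fine-tuning step generally looks like with the `Trainer` API. The file name `tweets.txt` and every hyperparameter shown are assumptions for illustration, not the project's settings.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "tweets.txt" is a hypothetical file with one cleaned tweet per line.
dataset = load_dataset("text", data_files={"train": "tweets.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# mlm=False gives standard causal-LM labels (shifted input ids).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-tweets",       # illustrative
    num_train_epochs=4,             # illustrative
    per_device_train_batch_size=8,  # illustrative
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```

Logging the resulting checkpoint to W&B, as the card describes, would be handled by the project's own tooling and is omitted here.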
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/inmidonot/1616937673978/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/inmidonot
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Su Başak AI Bot @inmidonot bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @inmidonot's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @inmidonot's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1342654008520011783/ELNBkoe__400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Insert 🚩🦮 🤖 AI Bot </div> <div style="font-size: 15px">@insert_name27 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@insert_name27's tweets](https://twitter.com/insert_name27). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 111 | | Short tweets | 491 | | Tweets kept | 2644 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3m2d1hmb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insert_name27's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ajldnpxe) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ajldnpxe/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insert_name27') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
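The training-data table in the card above reports that retweets and short tweets are dropped before fine-tuning. The sketch below shows one plausible way to express that filter; the `RT @` prefix check and the 20-character threshold are assumptions for illustration, since the card does not state the exact rule.

```python
def keep_tweet(text: str, min_chars: int = 20) -> bool:
    """Return True if a tweet should be kept for fine-tuning."""
    text = text.strip()
    if text.startswith("RT @"):   # drop classic retweets
        return False
    if len(text) < min_chars:     # drop "short tweets" (threshold is an assumption)
        return False
    return True

tweets = [
    "RT @someone: interesting thread",
    "lol",
    "Shipped a new side project today, write-up coming soon.",
]
kept = [t for t in tweets if keep_tweet(t)]
print(kept)  # only the last tweet survives the filter
```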
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/insert_name27/1617820538616/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/insert_name27
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Insert AI Bot @insert\_name27 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @insert\_name27's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insert\_name27's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418652395119153153/dvMUbHmM_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1449364913890074627/SNmSlTYD_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1450840619132260357/r9rdJtIp_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Pratham & Insha & Savio Martin ⚡️</div> <div style="text-align: center; font-size: 14px;">@insharamin-prathkum-saviomartin7</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Pratham & Insha & Savio Martin ⚡️. | Data | Pratham | Insha | Savio Martin ⚡️ | | --- | --- | --- | --- | | Tweets downloaded | 3246 | 3249 | 3249 | | Retweets | 461 | 24 | 118 | | Short tweets | 317 | 457 | 201 | | Tweets kept | 2468 | 2768 | 2930 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/o7jfvmhp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insharamin-prathkum-saviomartin7's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/p2md0wva) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/p2md0wva/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insharamin-prathkum-saviomartin7') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/insharamin-prathkum-saviomartin7/1637920907734/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/insharamin-prathkum-saviomartin7
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Pratham & Insha & Savio Martin ️ @insharamin-prathkum-saviomartin7 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Pratham & Insha & Savio Martin ️. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insharamin-prathkum-saviomartin7's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
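The processed `text_str` above keeps the "How to use" heading but drops the fenced snippet from the original card; this row's `text` field records the call, reproduced here as a runnable sketch (the loop over outputs and the prints are additions for illustration, not part of the card):

```python
from transformers import pipeline

# Model id taken from this record's `id` field.
generator = pipeline('text-generation', model='huggingtweets/insharamin-prathkum-saviomartin7')

# Prompt and num_return_sequences mirror the card's example; printing the
# decoded text is added here for convenience.
for output in generator("My dream is", num_return_sequences=5):
    print(output["generated_text"])
```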
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1276993788821540872/edbR86Jw_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Insufficiently Outraged 🤖 AI Bot </div> <div style="font-size: 15px">@insufficientout bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@insufficientout's tweets](https://twitter.com/insufficientout). | Data | Quantity | | --- | --- | | Tweets downloaded | 784 | | Retweets | 26 | | Short tweets | 68 | | Tweets kept | 690 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/5cu9fjjj/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @insufficientout's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c1v17ew) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c1v17ew/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/insufficientout') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/insufficientout/1616757946042/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/insufficientout
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Insufficiently Outraged AI Bot @insufficientout bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @insufficientout's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @insufficientout's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
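As with the previous record, the processed text omits the card's fenced snippet; the usage recorded in this row's `text` field is simply:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/insufficientout')
generator("My dream is", num_return_sequences=5)
```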
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374410731358019593/eBVT1vhW_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Spark Of Inquiry 🤖 AI Bot </div> <div style="font-size: 15px">@interro__bang bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@interro__bang's tweets](https://twitter.com/interro__bang). | Data | Quantity | | --- | --- | | Tweets downloaded | 114 | | Retweets | 2 | | Short tweets | 19 | | Tweets kept | 93 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1k112d2n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @interro__bang's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/uppi8vz0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/uppi8vz0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/interro__bang') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/interro__bang/1616611219490/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/interro__bang
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Spark Of Inquiry AI Bot @interro\_\_bang bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @interro\_\_bang's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @interro\_\_bang's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
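Same pattern for this record; `set_seed` is an optional addition (a standard transformers utility) for reproducible sampling and is not part of the original card:

```python
from transformers import pipeline, set_seed

set_seed(42)  # optional: makes the sampled outputs reproducible
generator = pipeline('text-generation', model='huggingtweets/interro__bang')
generator("My dream is", num_return_sequences=5)
```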
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/608132742224568320/x3yrArdT_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Electronic Intifada 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@intifada bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@intifada's tweets](https://twitter.com/intifada). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>6</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3235</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1qmm4ybr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @intifada's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/8f4jzilg) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/8f4jzilg/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intifada'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
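The card above demonstrates generation through the high-level `pipeline` API. As a complementary sketch (not part of the original card), the same model can be driven through the tokenizer and model classes directly; the generation settings below (`max_length`, `do_sample`, `num_return_sequences`) are illustrative assumptions, not values documented by the card.

```python
# Minimal sketch: load huggingtweets/intifada without the pipeline wrapper.
# Sampling settings here are illustrative, not taken from the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/intifada")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/intifada")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=40,
    do_sample=True,                       # sampling is needed for several distinct continuations
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS to avoid a warning
)
for sequence in outputs:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```

Driving the model classes directly exposes the full set of `generate` arguments (temperature, top-k, top-p), which the one-line pipeline call in the card leaves at their defaults.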
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/intifada/1603110719648/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/intifada
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Electronic Intifada AI Bot </div> <div style="font-size: 15px; color: #657786">@intifada bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @intifada's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3241</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>6</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>3235</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intifada'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intifada's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>6</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3235</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intifada'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intifada's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3241</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>6</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>3235</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intifada's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intifada'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 428, 74, 9, 166, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
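Each record in this dump closes with a 768-dimensional vector in the `embeddings` column. Nothing in the record states which encoder produced these vectors, so the sketch below only shows how such fixed-length embeddings are typically consumed, namely cosine similarity between two records for nearest-neighbour search over model cards; `vec_a` and `vec_b` are placeholders standing in for two rows of the column.

```python
# Sketch: comparing two rows of the 768-dim `embeddings` column with NumPy.
# The encoder that produced the vectors is not stated in this dump; the
# random vectors below are placeholders with the same dimensionality.
import numpy as np

def cosine_similarity(vec_a, vec_b):
    """Cosine similarity between two 1-D embedding vectors."""
    a = np.asarray(vec_a, dtype=np.float32)
    b = np.asarray(vec_b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

vec_a = np.random.rand(768)   # placeholder for one record's embedding
vec_b = np.random.rand(768)   # placeholder for another record's embedding
print(f"cosine similarity: {cosine_similarity(vec_a, vec_b):.4f}")
```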
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/922432805426130944/Zv5SABlH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos E. Perez 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@intuitmachine bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@intuitmachine's tweets](https://twitter.com/intuitmachine). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2912</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3a25w014/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @intuitmachine's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/g4lfqgv1) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/g4lfqgv1/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intuitmachine'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
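This card repeats the same `pipeline` call as the record above. One small, hedged addition worth noting: because the pipeline samples tokens, fixing the random seed with the standard `set_seed` utility makes the generated tweets reproducible across runs. The prompt and generation settings below are illustrative, not taken from the card.

```python
# Minimal sketch: reproducible sampling from huggingtweets/intuitmachine.
from transformers import pipeline, set_seed

set_seed(42)  # seed Python, NumPy and PyTorch RNGs so sampled outputs repeat
generator = pipeline("text-generation", model="huggingtweets/intuitmachine")
for out in generator("My dream is", max_length=40, do_sample=True, num_return_sequences=3):
    print(out["generated_text"])
```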
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/intuitmachine
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Carlos E. Perez AI Bot </div> <div style="font-size: 15px; color: #657786">@intuitmachine bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @intuitmachine's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>222</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>82</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2912</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/intuitmachine'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intuitmachine's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2912</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intuitmachine'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @intuitmachine's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>222</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>82</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2912</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @intuitmachine's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/intuitmachine'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 429, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1306312443388334081/oABG6C1L_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1393211665001459713/gobLbDve_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Steve | Millionaire Habits & Investor's Theory</div> <div style="text-align: center; font-size: 14px;">@investorstheory-steveonspeed</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Steve | Millionaire Habits & Investor's Theory. | Data | Steve | Millionaire Habits | Investor's Theory | | --- | --- | --- | | Tweets downloaded | 3245 | 3250 | | Retweets | 330 | 168 | | Short tweets | 320 | 660 | | Tweets kept | 2595 | 2422 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2yk0pwia/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @investorstheory-steveonspeed's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3hmaq3cx) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3hmaq3cx/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/investorstheory-steveonspeed') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/investorstheory-steveonspeed/1622080865723/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/investorstheory-steveonspeed
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;URL </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> AI CYBORG </div> <div style="text-align: center; font-size: 16px; font-weight: 800">Steve | Millionaire Habits & Investor's Theory</div> <div style="text-align: center; font-size: 14px;">@investorstheory-steveonspeed</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on tweets from Steve | Millionaire Habits & Investor's Theory. | Data | Steve | Millionaire Habits | Investor's Theory | | --- | --- | --- | | Tweets downloaded | 3245 | 3250 | | Retweets | 330 | 168 | | Short tweets | 320 | 660 | | Tweets kept | 2595 | 2422 | Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ## Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Steve | Millionaire Habits & Investor's Theory.\n\n| Data | Steve | Millionaire Habits | Investor's Theory |\n| --- | --- | --- |\n| Tweets downloaded | 3245 | 3250 |\n| Retweets | 330 | 168 |\n| Short tweets | 320 | 660 |\n| Tweets kept | 2595 | 2422 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on tweets from Steve | Millionaire Habits & Investor's Theory.\n\n| Data | Steve | Millionaire Habits | Investor's Theory |\n| --- | --- | --- |\n| Tweets downloaded | 3245 | 3250 |\n| Retweets | 330 | 168 |\n| Short tweets | 320 | 660 |\n| Tweets kept | 2595 | 2422 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## How to use\n\nYou can use this model directly with a pipeline for text generation:", "## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ 54, 34, 140, 81, 18, 47, 38 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.## Training data\n\nThe model was trained on tweets from Steve | Millionaire Habits & Investor's Theory.\n\n| Data | Steve | Millionaire Habits | Investor's Theory |\n| --- | --- | --- |\n| Tweets downloaded | 3245 | 3250 |\n| Retweets | 330 | 168 |\n| Short tweets | 320 | 660 |\n| Tweets kept | 2595 | 2422 |\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @investorstheory-steveonspeed's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Boris Dayma*\n\n![Follow](URL\n\nFor more details, visit the project repository.\n\n![GitHub stars](URL" ]
[ -0.052989304065704346, 0.08916506916284561, -0.004419144708663225, 0.09043741226196289, 0.1083783432841301, -0.007582562975585461, 0.06893041729927063, 0.09898775070905685, -0.02937331236898899, 0.08591943234205246, 0.05302633345127106, 0.03539879992604256, 0.07792805135250092, 0.21041496098041534, 0.05860988423228264, -0.2243071347475052, -0.018036343157291412, -0.08374708145856857, -0.060649096965789795, 0.1440672129392624, 0.08525625616312027, -0.10436011105775833, 0.07313457876443863, -0.03595481812953949, -0.08801836520433426, 0.015516337938606739, -0.00789494626224041, -0.06365523487329483, 0.09429264813661575, 0.03502355515956879, 0.0401947908103466, 0.03156924247741699, 0.06762062758207321, -0.14867615699768066, 0.0355309434235096, 0.12152627110481262, 0.017064623534679413, 0.08751048147678375, 0.06919489800930023, -0.03894539549946785, 0.09063345938920975, -0.03884192183613777, 0.07835343480110168, 0.060460515320301056, -0.16027404367923737, -0.10701882094144821, -0.1328098177909851, 0.10169785469770432, 0.0799751952290535, 0.07723895460367203, -0.03450120985507965, 0.11634459346532822, -0.00928403902798891, 0.05652424320578575, 0.20837126672267914, -0.2684208154678345, -0.026290984824299812, 0.005831540562212467, 0.03329497575759888, 0.08932434767484665, -0.09376996755599976, 0.015247294679284096, 0.010862540453672409, 0.04213628172874451, 0.10836418718099594, -0.032753050327301025, 0.16729755699634552, -0.0008024487178772688, -0.12804538011550903, -0.03991679102182388, 0.08106973022222519, 0.023769009858369827, -0.06585616618394852, -0.12210909277200699, -0.04892691224813461, -0.11982914805412292, 0.0023805515374988317, -0.010295217856764793, 0.010639394633471966, -0.002515623578801751, -0.08963298797607422, -0.05844397842884064, -0.06383997946977615, -0.01548083871603012, -0.029277751222252846, 0.039109718054533005, 0.022346310317516327, 0.050934553146362305, -0.08855133503675461, 0.19777405261993408, 0.09671946614980698, -0.12924939393997192, -0.04588429257273674, -0.08634117245674133, -0.06985079497098923, -0.03898536413908005, 0.05981061980128288, 0.06120807304978371, 0.009158635511994362, 0.17286257445812225, -0.05783538520336151, 0.0220355074852705, -0.010433271527290344, 0.023298978805541992, 0.07758135348558426, 0.11597010493278503, -0.12410051375627518, -0.1399560421705246, 0.07057264447212219, 0.010335602797567844, -0.08600886166095734, -0.031232908368110657, 0.0350458100438118, 0.04390290752053261, 0.028396492823958397, 0.09051144123077393, 0.04844624921679497, 0.0704406350851059, -0.03214345499873161, -0.033704426139593124, 0.05921149626374245, -0.16225466132164001, 0.04008309170603752, 0.008172139525413513, -0.057386476546525955, 0.011081917211413383, 0.019095320254564285, -0.034202080219984055, -0.12311988323926926, 0.09580188989639282, -0.07701897621154785, -0.03464551270008087, -0.07666946947574615, -0.08108671009540558, -0.014784829691052437, 0.00312315602786839, -0.049902431666851044, -0.08111504465341568, -0.12886518239974976, -0.05291995406150818, 0.009698580019176006, -0.05914118140935898, 0.0025610369630157948, 0.011842235922813416, -0.010244475677609444, -0.013709339313209057, -0.02261597476899624, 0.05859375, -0.047651879489421844, 0.06063396856188774, -0.06898961216211319, 0.036868102848529816, 0.08316938579082489, 0.030309706926345825, -0.11094183474779129, 0.058339923620224, -0.12048536539077759, 0.1365068256855011, -0.05024100840091705, 0.035672612488269806, -0.13679690659046173, -0.10253995656967163, -0.02520100027322769, -0.02112671732902527, 
0.040340982377529144, 0.1297067403793335, -0.12768243253231049, -0.04530799388885498, 0.2213025689125061, -0.047691065818071365, -0.04343007504940033, 0.07333916425704956, -0.07065284252166748, 0.0533071793615818, 0.10693598538637161, 0.04618204012513161, 0.07680147141218185, -0.04681945592164993, -0.01211059931665659, -0.04882727935910225, -0.0882418304681778, 0.15933258831501007, 0.031953588128089905, -0.006406752392649651, 0.05628741532564163, -0.0050581395626068115, -0.04104152321815491, -0.02487977407872677, -0.06877341866493225, -0.023803533986210823, 0.03178491070866585, -0.01814117096364498, -0.02113770693540573, -0.04930160194635391, -0.021274831146001816, -0.06435571610927582, -0.11409439891576767, 0.04040296748280525, 0.06073778495192528, -0.026250042021274567, 0.004169937688857317, -0.10028700530529022, 0.025468619540333748, 0.017063580453395844, 0.015305357985198498, -0.178205668926239, -0.012490861117839813, 0.020830063149333, -0.016594173386693, 0.11015969514846802, 0.02012212947010994, 0.04715324938297272, 0.06912171095609665, -0.018357079476118088, -0.013229161500930786, -0.05433805659413338, 0.0063519906252622604, -0.04072216525673866, -0.1683414876461029, -0.029937326908111572, -0.05997743085026741, 0.037364933639764786, -0.06088420748710632, -0.00808629672974348, 0.10493353009223938, 0.12366846203804016, 0.03880120813846588, -0.06041504442691803, 0.0013197468360885978, -0.026988966390490532, 0.016672173514962196, -0.09788823127746582, -0.015148468315601349, -0.006333990953862667, -0.015870770439505577, 0.08909320831298828, -0.14213889837265015, -0.002049144357442856, 0.12126440554857254, 0.03963726386427879, -0.11910710483789444, 0.012986632063984871, -0.0771399587392807, -0.0231649037450552, -0.08672869205474854, -0.03430957347154617, 0.2256869077682495, 0.014674829319119453, 0.0867898091673851, -0.061869967728853226, -0.07462425529956818, -0.0073344348929822445, -0.008982311934232712, -0.010026243515312672, 0.087124764919281, -0.006464546546339989, -0.2698087990283966, 0.07479051500558853, 0.028826385736465454, 0.13824377954006195, 0.16315653920173645, -0.0029970721807330847, -0.06695203483104706, -0.032619722187519073, -0.033416543155908585, 0.005802924279123545, 0.08924129605293274, -0.0062330374494194984, 0.004655384458601475, 0.03811429813504219, 0.04714808240532875, 0.020373456180095673, -0.09825146943330765, 0.01608816161751747, 0.04403383284807205, -0.023869195953011513, -0.037085529416799545, -0.009816492907702923, 0.02020338736474514, 0.1316375583410263, 0.05429300665855408, 0.06202879548072815, -0.05238322913646698, -0.0539250485599041, -0.11938300728797913, 0.1467030793428421, -0.10468559712171555, -0.18933191895484924, -0.11749635636806488, -0.06351390480995178, 0.06910248100757599, -0.00768295768648386, 0.045106712728738785, -0.019471973180770874, -0.04921979457139969, -0.0827677845954895, 0.0869961678981781, 0.024721870198845863, -0.01768754981458187, 0.06458830088376999, 0.011638129130005836, 0.0046905758790671825, -0.135602205991745, -0.03039228729903698, 0.018244417384266853, -0.080750472843647, 0.024707404896616936, 0.07145805656909943, 0.035535991191864014, 0.1174878403544426, -0.01675434038043022, 0.03493957221508026, -0.03404371440410614, 0.25116145610809326, -0.1215091273188591, 0.09314968436956406, 0.11997117847204208, -0.016901426017284393, 0.07042604684829712, 0.08965841680765152, 0.038231972604990005, -0.06656842678785324, 0.06479968130588531, 0.024226607754826546, -0.06990349292755127, -0.20648036897182465, -0.00936998799443245, 
-0.01908639445900917, 0.026460248976945877, 0.07322593033313751, 0.06489157676696777, -0.05764591693878174, -0.023523900657892227, -0.0769086480140686, -0.0062874858267605305, 0.07022640109062195, 0.07013538479804993, -0.0872521623969078, 0.009390470571815968, 0.07535520941019058, -0.03396061435341835, 0.015831612050533295, 0.11889608204364777, -0.024624811485409737, 0.21047694981098175, -0.07051458209753036, 0.07948901504278183, 0.05566627159714699, 0.052202459424734116, 0.055916644632816315, 0.059386447072029114, -0.02500285394489765, 0.05378492921590805, -0.01074699591845274, -0.07641328126192093, 0.035835716873407364, 0.01125375833362341, -0.017222072929143906, 0.005967709235846996, -0.00730024604126811, -0.039492275565862656, 0.10534211248159409, 0.23087207973003387, 0.05775519460439682, -0.17519570887088776, -0.08873029053211212, 0.031236249953508377, -0.08053030073642731, -0.055573198944330215, -0.008842334151268005, 0.03811894729733467, -0.1949753612279892, 0.017999950796365738, -0.05114198848605156, 0.11962147802114487, -0.07709644734859467, 0.012439757585525513, 0.10749142616987228, 0.08619602769613266, -0.07899165153503418, 0.04434216767549515, -0.24680903553962708, 0.11013886332511902, -0.01102240476757288, 0.07777253538370132, -0.02879684790968895, 0.008886335417628288, 0.026704968884587288, -0.0094302948564291, 0.11153002083301544, 0.03592047840356827, 0.051589932292699814, -0.10015059262514114, -0.07446770370006561, -0.012067660689353943, 0.09390833228826523, -0.1183331087231636, 0.10512881726026535, -0.007939628325402737, 0.008318115957081318, -0.03519733250141144, -0.08971958607435226, -0.07297129929065704, -0.10944608598947525, 0.06340623646974564, -0.19625963270664215, 0.0030863804277032614, -0.06792999058961868, -0.02503998763859272, -0.012894310057163239, 0.15817613899707794, -0.11704467982053757, -0.09089470654726028, -0.12777945399284363, 0.09659420698881149, 0.08496972918510437, -0.0824943333864212, 0.038728028535842896, 0.016985975205898285, 0.052487559616565704, 0.027058042585849762, -0.14093580842018127, 0.07105810940265656, -0.05748305842280388, -0.2371671348810196, -0.04118601232767105, 0.14633475244045258, 0.10631594806909561, 0.030503828078508377, 0.002247454831376672, 0.02609691396355629, -0.021836595609784126, -0.145431250333786, 0.028825320303440094, 0.027989579364657402, 0.051245372742414474, 0.031656231731176376, 0.04448184370994568, -0.027624206617474556, -0.14918576180934906, -0.013965552672743797, 0.03594591096043587, 0.22355996072292328, -0.09384208172559738, 0.1370287984609604, 0.06609729677438736, -0.0681387186050415, -0.22591851651668549, 0.005633611232042313, -0.009255130775272846, 0.03029523603618145, 0.055267106741666794, -0.13126599788665771, 0.05852552130818367, 0.047347817569971085, -0.0002779600617941469, 0.11027131974697113, -0.3252066671848297, -0.13494446873664856, 0.03428773209452629, 0.035268791019916534, -0.038222603499889374, -0.08042152225971222, -0.04427460953593254, -0.005585879553109407, -0.20891410112380981, 0.050946589559316635, -0.12527208030223846, 0.08347887545824051, 0.015411674976348877, 0.037753645330667496, 0.005201255902647972, -0.049019135534763336, 0.09074291586875916, -0.029244497418403625, 0.056314535439014435, -0.08247431367635727, -0.026436053216457367, 0.0908411368727684, -0.044319260865449905, 0.0222313292324543, 0.016381753608584404, 0.0876498743891716, -0.07840775698423386, -0.03619411215186119, -0.08051945269107819, 0.050664715468883514, -0.07035980373620987, -0.08880096673965454, -0.06777377426624298, 
0.09486933797597885, 0.07044056802988052, -0.06733478605747223, -0.08894386142492294, -0.02969818376004696, 0.0003982977650593966, 0.15707992017269135, 0.06351868808269501, 0.050534117966890335, -0.0975470095872879, -0.01741054840385914, -0.006876840256154537, 0.02614343911409378, -0.08768174052238464, 0.04625527933239937, 0.10867264121770859, 0.03777482360601425, 0.10647685080766678, 0.030540257692337036, -0.16134537756443024, 0.018196115270256996, 0.06106216832995415, -0.14903709292411804, -0.1317768543958664, -0.018291708081960678, -0.02262127213180065, -0.03588847815990448, 0.0034325835295021534, 0.1129283607006073, -0.03770812228322029, -0.04402100294828415, 0.02054159715771675, 0.06710349023342133, -0.04610805585980415, 0.1601894199848175, 0.018310213461518288, 0.06658171117305756, -0.08915625512599945, 0.10055315494537354, 0.07114717364311218, 0.019296351820230484, -0.0035935731139034033, 0.14947155117988586, -0.13708506524562836, -0.027989018708467484, 0.0007436366286128759, -0.0017767646349966526, 0.01966259442269802, 0.03536062315106392, 0.003153544384986162, -0.09769465029239655, 0.05087607726454735, 0.08875242620706558, 0.005137774161994457, 0.07899025082588196, -0.017983851954340935, 0.031065646559000015, -0.048278700560331345, 0.09579166024923325, 0.0950283333659172, 0.02628648281097412, -0.053477101027965546, 0.17830096185207367, 0.01225209143012762, 0.0634276270866394, -0.051222480833530426, -0.035118766129016876, -0.10391327738761902, 0.006104505620896816, -0.08454829454421997, -0.027481069788336754, -0.07038741558790207, -0.0323387049138546, -0.03272446244955063, -0.05283132940530777, -0.0071152588352561, 0.058840829879045486, -0.03260383754968643, -0.050626568496227264, -0.04723351448774338, 0.026061005890369415, -0.15448544919490814, 0.012621378526091576, 0.08851451426744461, -0.07393155992031097, 0.14910751581192017, 0.04844384640455246, -0.035261157900094986, 0.03291510418057442, -0.06133972853422165, 0.03219790756702423, -0.03966328129172325, -0.012174069881439209, 0.06412004679441452, -0.07431700825691223, 0.02563190646469593, -0.08603369444608688, -0.05778002366423607, 0.0074661350809037685, 0.04526027292013168, -0.12762100994586945, 0.02476905658841133, 0.0005521032144315541, -0.002122231526300311, -0.08594141155481339, 0.05183231085538864, 0.051307156682014465, 0.04302448406815529, 0.08022808283567429, -0.053919170051813126, 0.08525554835796356, -0.19891759753227234, -0.046949662268161774, 0.0489492267370224, -0.0335555374622345, 0.02119121141731739, -0.013970390893518925, 0.07497859746217728, -0.0513622872531414, 0.164493128657341, 0.015923354774713516, -0.031595390290021896, 0.05805768072605133, -0.06592845171689987, -0.08106344938278198, 0.06349601596593857, 0.03362319618463516, -0.04802732914686203, -0.032778624445199966, -0.07572592049837112, -0.027521569281816483, -0.0631730780005455, -0.06069502234458923, 0.17888283729553223, 0.11885969340801239, 0.12346436828374863, -0.0491100437939167, 0.05026663467288017, -0.028216751292347908, -0.16000036895275116, -0.09022560715675354, -0.025404008105397224, 0.0334407277405262, -0.08691874146461487, 0.05656279996037483, 0.1539267748594284, -0.15455876290798187, 0.107005774974823, -0.03153108432888985, -0.05993426591157913, -0.07363002747297287, -0.15853795409202576, -0.009838081896305084, -0.034923892468214035, 0.016072221100330353, -0.09855081140995026, 0.08476542681455612, 0.043968599289655685, 0.06802459806203842, -0.049958236515522, 0.09827017039060593, -0.03304595500230789, -0.0936545580625534, 
0.07075408846139908, 0.053731996566057205, 0.022728320211172104, 0.09147153049707413, 0.05115588754415512, -0.020977722480893135, 0.03884556517004967, 0.07087741792201996, 0.03173234686255455, -0.03280164673924446, -0.0100471843034029, -0.03151015192270279, -0.07229112088680267, 0.018811572343111038, -0.006833859719336033, -0.04411396384239197, 0.11351611465215683, 0.048128727823495865, -0.008378430269658566, -0.021052224561572075, 0.30129843950271606, -0.05221741646528244, -0.05419580638408661, -0.15044501423835754, 0.2369777262210846, 0.024136563763022423, 0.01431811973452568, 0.031388696283102036, -0.1422618329524994, 0.003641017246991396, 0.14822860062122345, 0.24482333660125732, -0.06561960279941559, 0.02634389139711857, -0.0005428040749393404, 0.016539622098207474, 0.04802359640598297, 0.11376038938760757, 0.08037017285823822, 0.1468793898820877, -0.04274271801114082, 0.094022735953331, -0.029712624847888947, -0.05911606177687645, -0.047393396496772766, 0.09247975796461105, 0.0196718517690897, 0.03474074602127075, -0.09610766172409058, 0.08929180353879929, -0.06510668992996216, -0.34623026847839355, 0.0063139284029603004, -0.06105858087539673, -0.11438216269016266, 0.027549168094992638, -0.05447468161582947, 0.006668252870440483, 0.05296988785266876, 0.06019420921802521, 0.012875896878540516, 0.09859436750411987, 0.0682796835899353, -0.01402986515313387, -0.010491297580301762, 0.10010478645563126, -0.11377984285354614, 0.1299189180135727, -0.010455171577632427, 0.012933677062392235, 0.08295665681362152, 0.03377533331513405, -0.11122641712427139, 0.0507325641810894, 0.02944953925907612, -0.04849530756473541, -0.02869727835059166, 0.16803307831287384, 0.016949746757745743, -0.03281408175826073, 0.026115892454981804, -0.05550635978579521, -0.002358553232625127, -0.11064492911100388, 0.03638167306780815, -0.10665888339281082, 0.04016304388642311, -0.026975451037287712, 0.14150838553905487, 0.1821669489145279, -0.05562172457575798, 0.05806223675608635, -0.06864997744560242, -0.014059875160455704, -0.003512546420097351, 0.068825863301754, -0.024278346449136734, -0.12771075963974, 0.0020319190807640553, 0.010388823226094246, 0.041989799588918686, -0.14133156836032867, -0.0571419820189476, -0.015424640849232674, -0.017811018973588943, -0.032420556992292404, 0.18982712924480438, 0.07011539489030838, 0.05894146114587784, -0.0325888954102993, -0.01336154155433178, 0.004746219143271446, 0.13339324295520782, -0.16598296165466309, -0.026754682883620262 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1343581966541545472/Bs7oM0IV_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ada IO 🤖 AI Bot </div> <div style="font-size: 15px">@ioorbust bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ioorbust's tweets](https://twitter.com/ioorbust). | Data | Quantity | | --- | --- | | Tweets downloaded | 789 | | Retweets | 79 | | Short tweets | 102 | | Tweets kept | 608 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/zuxd4c8i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ioorbust's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1nt569uh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1nt569uh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ioorbust') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ioorbust/1617757328084/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ioorbust
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ada IO AI Bot @ioorbust bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ioorbust's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ioorbust's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
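This processed copy of the @ioorbust card likewise omits the usage snippet shown in the full card above (model id `huggingtweets/ioorbust`). A minimal sketch assuming the standard `transformers` API; `set_seed`, `do_sample`, and `max_new_tokens` are illustrative additions for reproducible sampling and are not part of the original example:

```python
from transformers import pipeline, set_seed

set_seed(42)  # illustrative: fixes the RNG so the sampled continuations are reproducible

generator = pipeline('text-generation', model='huggingtweets/ioorbust')

# Same prompt and number of sequences as the card's example; sampling settings are illustrative.
outputs = generator("My dream is", num_return_sequences=5, do_sample=True, max_new_tokens=40)
for out in outputs:
    print(out['generated_text'])
```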
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1144996963252940800/VIHkkMCF_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">⌞ʙᴀʟᴀᴢꜱ⌝ 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@iotnerd bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@iotnerd's tweets](https://twitter.com/iotnerd). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>915</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2183</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gq45sm3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iotnerd's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ksu06s41) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ksu06s41/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/iotnerd'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/iotnerd/1611677898375/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/iotnerd
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">⌞ʙᴀʟᴀᴢꜱ⌝ AI Bot </div> <div style="font-size: 15px; color: #657786">@iotnerd bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @iotnerd's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>915</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2183</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/iotnerd'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @iotnerd's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>915</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2183</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/iotnerd'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @iotnerd's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>915</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2183</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @iotnerd's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/iotnerd'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1420146838779400197/98VH7-UW_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Ivan Poduje</div> <div style="text-align: center; font-size: 14px;">@ipoduje</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Ivan Poduje. | Data | Ivan Poduje | | --- | --- | | Tweets downloaded | 3230 | | Retweets | 1035 | | Short tweets | 135 | | Tweets kept | 2060 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gyttyi09/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ipoduje's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29wmg1mk) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29wmg1mk/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ipoduje') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ipoduje/1641572179072/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ipoduje
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Ivan Poduje @ipoduje I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Ivan Poduje. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ipoduje's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
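The processed text above drops the code block that follows "You can use this model directly with a pipeline for text generation:". For readability, the snippet from the full model card earlier in this record is reproduced here:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/ipoduje')
generator("My dream is", num_return_sequences=5)
```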
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1432037158072856578/a_Fty68E_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Riikka Purra</div> <div style="text-align: center; font-size: 14px;">@ir_rkp</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Riikka Purra. | Data | Riikka Purra | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 141 | | Short tweets | 78 | | Tweets kept | 3031 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1w0bzvgu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ir_rkp's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1nj4v31w) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1nj4v31w/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ir_rkp') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/ir_rkp/1643976228944/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ir_rkp
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Riikka Purra @ir\_rkp I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Riikka Purra. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ir\_rkp's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
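As in the previous record, the code block after "You can use this model directly with a pipeline for text generation:" was stripped during processing. Besides the `pipeline` call shown in the full card above, a lower-level equivalent using `AutoModelForCausalLM.generate` would look roughly like the sketch below; the sampling settings are illustrative assumptions.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/ir_rkp")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/ir_rkp")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                        # sample instead of greedy decoding
    max_new_tokens=40,                     # illustrative length cap
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no pad token
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```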
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360344954485227529/r2dktZMm_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Kevin 🤖 AI Bot </div> <div style="font-size: 15px">@is_he_batman bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@is_he_batman's tweets](https://twitter.com/is_he_batman). | Data | Quantity | | --- | --- | | Tweets downloaded | 960 | | Retweets | 51 | | Short tweets | 75 | | Tweets kept | 834 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/25g6159m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @is_he_batman's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2yerrfcg) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2yerrfcg/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/is_he_batman') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/is_he_batman/1614109879160/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/is_he_batman
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Kevin AI Bot @is\_he\_batman bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @is\_he\_batman's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @is\_he\_batman's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1092831572645036035/yvgPGtOn_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Ishan 🤖 AI Bot </div> <div style="font-size: 15px">@ishanspatil bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ishanspatil's tweets](https://twitter.com/ishanspatil). | Data | Quantity | | --- | --- | | Tweets downloaded | 2468 | | Retweets | 346 | | Short tweets | 231 | | Tweets kept | 1891 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/4iupc1l1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ishanspatil's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/k7nyg63n) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/k7nyg63n/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ishanspatil') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ishanspatil/1617782474953/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ishanspatil
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Ishan AI Bot @ishanspatil bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @ishanspatil's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ishanspatil's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1448436144388009985/zWh5cSQ3_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">نورهان</div> <div style="text-align: center; font-size: 14px;">@islamocommunism</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from نورهان. | Data | نورهان | | --- | --- | | Tweets downloaded | 3196 | | Retweets | 1205 | | Short tweets | 227 | | Tweets kept | 1764 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2l8ikj22/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamocommunism's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2kngkxcq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2kngkxcq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamocommunism') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamocommunism/1635014280450/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/islamocommunism
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT نورهان @islamocommunism I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from نورهان. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamocommunism's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
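The processed text above keeps the card's "How to use" sentence but drops its fenced example. A minimal sketch of that pipeline usage for this record follows; the model id and prompt come from the card itself, while the explicit sampling arguments (`do_sample`, `top_p`, `max_length`) and the printing loop are illustrative assumptions, not settings taken from the card:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint behind a text-generation pipeline.
generator = pipeline('text-generation', model='huggingtweets/islamocommunism')

# Sample five continuations of the prompt used in the card's example.
# The sampling settings below are assumptions, not values from the card.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    do_sample=True,
    top_p=0.95,
    max_length=50,
)
for out in outputs:
    print(out["generated_text"])
```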
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1381764452098437120/74IgKP07_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1368077075127603200/Z08slO2P_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Boston Psychology PhD & keyvan</div> <div style="text-align: center; font-size: 14px;">@islamphobiacow-praisegodbarbon</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Boston Psychology PhD & keyvan. | Data | Boston Psychology PhD | keyvan | | --- | --- | --- | | Tweets downloaded | 3224 | 3242 | | Retweets | 858 | 179 | | Short tweets | 251 | 223 | | Tweets kept | 2115 | 2840 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3egvdux4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamphobiacow-praisegodbarbon's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/34hmjrwi) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/34hmjrwi/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamphobiacow-praisegodbarbon') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamphobiacow-praisegodbarbon/1627056382131/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/islamphobiacow-praisegodbarbon
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Boston Psychology PhD & keyvan @islamphobiacow-praisegodbarbon I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Boston Psychology PhD & keyvan. Data: Tweets downloaded, Boston Psychology PhD: 3224, keyvan: 3242 Data: Retweets, Boston Psychology PhD: 858, keyvan: 179 Data: Short tweets, Boston Psychology PhD: 251, keyvan: 223 Data: Tweets kept, Boston Psychology PhD: 2115, keyvan: 2840 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamphobiacow-praisegodbarbon's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
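As above, the processed text for this record mentions pipeline usage but its code block was stripped. A minimal sketch for this checkpoint, this time fixing the random seed so repeated runs produce the same samples; the seeding call is an illustrative addition and is not part of the card's own example:

```python
from transformers import pipeline, set_seed

# Fix the RNG so repeated runs sample the same continuations.
# Seeding is an assumption for reproducibility; the card does not set one.
set_seed(42)

generator = pipeline('text-generation', model='huggingtweets/islamphobiacow-praisegodbarbon')
for out in generator("My dream is", num_return_sequences=5):
    print(out["generated_text"])
```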
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1368077075127603200/Z08slO2P_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">beff jezos</div> <div style="text-align: center; font-size: 14px;">@islamphobiacow</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from beff jezos. | Data | beff jezos | | --- | --- | | Tweets downloaded | 395 | | Retweets | 36 | | Short tweets | 37 | | Tweets kept | 322 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1crtakdb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamphobiacow's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/29lljwti) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/29lljwti/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamphobiacow') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamphobiacow/1627597861566/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/islamphobiacow
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT beff jezos @islamphobiacow I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from beff jezos. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamphobiacow's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
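The card for this record only shows the high-level pipeline helper. A sketch of an assumed-equivalent lower-level route using the standard `AutoTokenizer`/`AutoModelForCausalLM` classes and `generate()` on the same checkpoint; the sampling parameters and output handling are illustrative assumptions, not taken from the card:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed equivalent of the card's pipeline example, using generate() directly.
model_id = "huggingtweets/islamphobiacow"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("My dream is", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        do_sample=True,        # sampling settings here are illustrative assumptions
        top_p=0.95,
        max_new_tokens=40,
        num_return_sequences=3,
    )
for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```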
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1344394699470082049/YzE4UMsj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Rizza Islam 🤖 AI Bot </div> <div style="font-size: 15px">@islamrizza bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@islamrizza's tweets](https://twitter.com/islamrizza). | Data | Quantity | | --- | --- | | Tweets downloaded | 3195 | | Retweets | 73 | | Short tweets | 394 | | Tweets kept | 2728 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/t09cn5o0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @islamrizza's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/m6l6wkff) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/m6l6wkff/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/islamrizza') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/islamrizza/1619378181874/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/islamrizza
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Rizza Islam AI Bot @islamrizza bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @islamrizza's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @islamrizza's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
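The processed text above says the model "can be used directly with a pipeline for text generation", but the accompanying code block was dropped during preprocessing. A minimal sketch of that usage, taken from the full card earlier in this record (only the `huggingtweets/islamrizza` model id is record-specific):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint referenced by this record.
generator = pipeline('text-generation', model='huggingtweets/islamrizza')

# Generate five continuations of the prompt used in the card's widget.
generator("My dream is", num_return_sequences=5)
```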
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1152361188862365697/HWUuVltf_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">nick casino 🤖 AI Bot </div> <div style="font-size: 15px">@island_iverson bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@island_iverson's tweets](https://twitter.com/island_iverson). | Data | Quantity | | --- | --- | | Tweets downloaded | 3182 | | Retweets | 367 | | Short tweets | 193 | | Tweets kept | 2622 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/dlr58v3e/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @island_iverson's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vy3qci6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vy3qci6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/island_iverson') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/island_iverson/1614113195211/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/island_iverson
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
nick casino AI Bot @island\_iverson bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @island\_iverson's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @island\_iverson's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
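As with the previous record, the usage code block was stripped from this processed text. The full card only shows the `pipeline` helper; the sketch below is an assumed lower-level equivalent using the standard `AutoTokenizer`/`AutoModelForCausalLM` API, where `max_length=40` and `do_sample=True` are illustrative choices not taken from the card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the same checkpoint without the pipeline wrapper (GPT-2 architecture per the tags above).
tokenizer = AutoTokenizer.from_pretrained('huggingtweets/island_iverson')
model = AutoModelForCausalLM.from_pretrained('huggingtweets/island_iverson')

# Sample five continuations of the card's example prompt.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(**inputs, do_sample=True, max_length=40, num_return_sequences=5)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```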
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1340996475472494593/yqCQjZ06_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1341001037142999041/h86Ch8TO_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Science Bits & International Science Teaching Foundation</div> <div style="text-align: center; font-size: 14px;">@istfoundation-sciencebits</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Science Bits & International Science Teaching Foundation. | Data | Science Bits | International Science Teaching Foundation | | --- | --- | --- | | Tweets downloaded | 2741 | 163 | | Retweets | 759 | 103 | | Short tweets | 47 | 1 | | Tweets kept | 1935 | 59 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/c9crff9r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @istfoundation-sciencebits's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2c68vj42) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2c68vj42/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/istfoundation-sciencebits') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/istfoundation-sciencebits/1634209108264/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/istfoundation-sciencebits
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG Science Bits & International Science Teaching Foundation @istfoundation-sciencebits I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Science Bits & International Science Teaching Foundation. Data: Tweets downloaded, Science Bits: 2741, International Science Teaching Foundation: 163 Data: Retweets, Science Bits: 759, International Science Teaching Foundation: 103 Data: Short tweets, Science Bits: 47, International Science Teaching Foundation: 1 Data: Tweets kept, Science Bits: 1935, International Science Teaching Foundation: 59 Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @istfoundation-sciencebits's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
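This record's usage example was likewise stripped during preprocessing. The sketch below restates the pipeline call from the full card, with an added `set_seed` call for reproducible sampling (the seed value is an illustrative addition, not part of the original card):

```python
from transformers import pipeline, set_seed

# Fix the sampling seed so repeated runs produce the same generations.
set_seed(42)

generator = pipeline('text-generation', model='huggingtweets/istfoundation-sciencebits')
generator("My dream is", num_return_sequences=5)
```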
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
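The long float arrays above are the record's `embeddings` field: a single 768-dimensional vector stored alongside the `passage: TAGS …` string in `input_texts`. Nothing in this dump names the encoder that produced these vectors, so the sketch below is illustrative only — it mean-pools a `bert-base-uncased` encoder (hidden size 768) over the passage string to obtain a vector of the same shape; both the model choice and the pooling strategy are assumptions, not the actual build step.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed encoder: any 768-hidden-size model gives a vector of the stored width.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Mirrors the input_texts value stored next to the embedding in this record.
text = ("passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets "
        "#en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n")

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state   # shape: (1, seq_len, 768)
vector = hidden.mean(dim=1).squeeze(0)           # shape: (768,), same width as the stored field
print(vector.shape)
```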
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1359348009725808641/KyPjQGzk_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">itemLabel 🤖 AI Bot </div> <div style="font-size: 15px">@itemlabel bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itemlabel's tweets](https://twitter.com/itemlabel). | Data | Quantity | | --- | --- | | Tweets downloaded | 3188 | | Retweets | 1796 | | Short tweets | 389 | | Tweets kept | 1003 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/10hookja/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itemlabel's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1u63m0wj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1u63m0wj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itemlabel') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itemlabel
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
itemLabel AI Bot @itemlabel bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itemlabel's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itemlabel's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
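The `text` and `text_str` fields of the record above describe the training procedure — a pre-trained GPT-2 fine-tuned on @itemlabel's kept tweets — but only link to the W&B run for details. A minimal sketch of that step with the standard `transformers` Trainer API, assuming the kept tweets have already been written to a plain-text file; `tweets.txt`, the block size, and the hyperparameters are placeholders, not the actual huggingtweets settings.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer, DataCollatorForLanguageModeling,
                          TextDataset, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
model = AutoModelForCausalLM.from_pretrained("gpt2")

# "tweets.txt" is a hypothetical file holding the kept tweets, one per line.
train_dataset = TextDataset(tokenizer=tokenizer, file_path="tweets.txt", block_size=128)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)  # causal-LM objective

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="gpt2-tweets", overwrite_output_dir=True,
                           num_train_epochs=1, per_device_train_batch_size=1),
    data_collator=collator,
    train_dataset=train_dataset,
)
trainer.train()
trainer.save_model("gpt2-tweets")  # the saved model can then back the text-generation pipeline
```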
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1378965688459489280/VViTlDIl_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Google ‘Its All Bullshit’ 🤖 AI Bot </div> <div style="font-size: 15px">@itsall_bullshit bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsall_bullshit's tweets](https://twitter.com/itsall_bullshit). | Data | Quantity | | --- | --- | | Tweets downloaded | 3158 | | Retweets | 1762 | | Short tweets | 98 | | Tweets kept | 1298 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/25y8c5ov/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsall_bullshit's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/y0ks8zfn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/y0ks8zfn/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsall_bullshit') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsall_bullshit/1617823122662/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itsall_bullshit
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Google ‘Its All Bullshit’ AI Bot @itsall\_bullshit bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsall\_bullshit's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsall\_bullshit's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
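Each record also carries a `tokens_length` value (the `[ 57 ]` a few lines above) for the `TAGS …` passage stored in the neighbouring list fields. The dump does not say which tokenizer produced that count, so the sketch below uses GPT-2's tokenizer as an assumption; the printed number may therefore differ from the stored value.

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")  # assumed; the builder's tokenizer is not stated

# Mirrors the "TAGS ..." passage stored in the record above.
passage = ("TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets "
           "#en #autotrain_compatible #endpoints_compatible #text-generation-inference "
           "#region-us \n")
print(len(tokenizer(passage)["input_ids"]))  # compare against the stored tokens_length entry
```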
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1346051916556623875/e66ZNvO2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Big Ian 🤖 AI Bot </div> <div style="font-size: 15px">@itsbigian bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsbigian's tweets](https://twitter.com/itsbigian). | Data | Quantity | | --- | --- | | Tweets downloaded | 3238 | | Retweets | 218 | | Short tweets | 552 | | Tweets kept | 2468 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2oczo3b8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsbigian's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/245obnds) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/245obnds/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsbigian') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsbigian/1616883483325/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itsbigian
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Big Ian AI Bot @itsbigian bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsbigian's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsbigian's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1376928476633137157/d4J78Fmv_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Harveen 🤖 AI Bot </div> <div style="font-size: 15px">@itsharveen bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsharveen's tweets](https://twitter.com/itsharveen). | Data | Quantity | | --- | --- | | Tweets downloaded | 632 | | Retweets | 30 | | Short tweets | 40 | | Tweets kept | 562 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/a779ia8t/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsharveen's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1dip1d5b) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1dip1d5b/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsharveen') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsharveen/1617627052674/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itsharveen
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Harveen AI Bot @itsharveen bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsharveen's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsharveen's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1348712980112936966/i5-XHX3G_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jane.flowers 🤖 AI Bot </div> <div style="font-size: 15px">@itsjaneflowers bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itsjaneflowers's tweets](https://twitter.com/itsjaneflowers). | Data | Quantity | | --- | --- | | Tweets downloaded | 1054 | | Retweets | 166 | | Short tweets | 79 | | Tweets kept | 809 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1af8sp4r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsjaneflowers's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/25kv3ol0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/25kv3ol0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsjaneflowers') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsjaneflowers/1616859152962/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itsjaneflowers
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jane.flowers AI Bot @itsjaneflowers bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itsjaneflowers's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsjaneflowers's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1355537154538000391/0mOGv6Mw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">june party corner</div> <div style="text-align: center; font-size: 14px;">@itskillerdog</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from june party corner. | Data | june party corner | | --- | --- | | Tweets downloaded | 196 | | Retweets | 20 | | Short tweets | 30 | | Tweets kept | 146 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1u7twx27/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itskillerdog's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vg0bbs8) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vg0bbs8/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itskillerdog') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itskillerdog/1630971994166/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itskillerdog
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT june party corner @itskillerdog I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from june party corner. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itskillerdog's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
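The sentence "You can use this model directly with a pipeline for text generation:" originally introduced a code block that was dropped when this text field was processed. A minimal sketch of that usage, mirroring the snippet kept in this row's full model card and its recorded id huggingtweets/itskillerdog:

```python
from transformers import pipeline

# Build a text-generation pipeline from the fine-tuned GPT-2 checkpoint for this record.
generator = pipeline('text-generation', model='huggingtweets/itskillerdog')

# Prompt the bot and sample five continuations, as in the original card.
print(generator("My dream is", num_return_sequences=5))
```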
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1337464500446957570/ptHOR4kZ_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Luci Keller 🤖 AI Bot </div> <div style="font-size: 15px">@itslucikeller bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itslucikeller's tweets](https://twitter.com/itslucikeller). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 69 | | Short tweets | 352 | | Tweets kept | 2825 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3nhr24ju/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itslucikeller's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2zv0hvjq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2zv0hvjq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itslucikeller') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itslucikeller/1616622417664/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itslucikeller
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Luci Keller AI Bot @itslucikeller bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itslucikeller's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itslucikeller's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
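As above, the usage snippet this processed text refers to was stripped out; a short sketch based on the code block preserved in this row's full card (model id huggingtweets/itslucikeller):

```python
from transformers import pipeline

# Text-generation pipeline over the @itslucikeller fine-tuned GPT-2 model.
generator = pipeline('text-generation', model='huggingtweets/itslucikeller')

# Generate a few sample tweets from a prompt.
print(generator("My dream is", num_return_sequences=5))
```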
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1389992995848658948/XT1CKTIg_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Aqsa.</div> <div style="text-align: center; font-size: 14px;">@itsmeaqsaa</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Aqsa.. | Data | Aqsa. | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 77 | | Short tweets | 1543 | | Tweets kept | 1626 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1xy28krg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itsmeaqsaa's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/18kg27bt) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/18kg27bt/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itsmeaqsaa') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itsmeaqsaa/1631734394856/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itsmeaqsaa
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Aqsa. @itsmeaqsaa I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Aqsa.. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itsmeaqsaa's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
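Again, the generation example this sentence points to survives only in the row's full text field; a minimal sketch for the recorded model id huggingtweets/itsmeaqsaa:

```python
from transformers import pipeline

# Load the @itsmeaqsaa fine-tuned GPT-2 checkpoint as a text-generation pipeline.
generator = pipeline('text-generation', model='huggingtweets/itsmeaqsaa')

# Sample several continuations of the prompt used in the card's widget.
print(generator("My dream is", num_return_sequences=5))
```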
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1365914927580344322/b5PadSd5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">NFT ChiΞf of Staff 🤖 AI Bot </div> <div style="font-size: 15px">@itspublu bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@itspublu's tweets](https://twitter.com/itspublu). | Data | Quantity | | --- | --- | | Tweets downloaded | 1768 | | Retweets | 481 | | Short tweets | 282 | | Tweets kept | 1005 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2l8q7e87/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itspublu's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1vo0wnnt) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1vo0wnnt/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itspublu') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
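As a supplement to the `pipeline` example in the card above, the same checkpoint can also be loaded with an explicit tokenizer and model for finer control over sampling. The sketch below is illustrative only; the generation parameters (temperature, top-p, token budget) are assumptions, not values taken from the training run.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned GPT-2 checkpoint and its tokenizer explicitly.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/itspublu")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/itspublu")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                        # sample rather than greedy decode
    top_p=0.95,                            # nucleus sampling (assumed value)
    temperature=0.9,                       # assumed value
    max_new_tokens=40,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no pad token by default
)
for sequence in outputs:
    print(tokenizer.decode(sequence, skip_special_tokens=True))
```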
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itspublu/1616709602963/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itspublu
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
NFT ChiΞf of Staff AI Bot @itspublu bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @itspublu's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itspublu's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/628257137060229120/_3q_D4g2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Six words story</div> <div style="text-align: center; font-size: 14px;">@itssixword</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Six words story. | Data | Six words story | | --- | --- | | Tweets downloaded | 282 | | Retweets | 0 | | Short tweets | 2 | | Tweets kept | 280 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2dbtmbzz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @itssixword's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2wydugsv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2wydugsv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/itssixword') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
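Because generation is sampled, outputs vary from run to run. A minimal sketch of reproducible sampling with the same pipeline, using the `set_seed` helper from `transformers`; the sampling settings shown are illustrative assumptions.

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix the RNG so the sampled completions are repeatable

generator = pipeline('text-generation', model='huggingtweets/itssixword')
samples = generator(
    "My dream is",
    do_sample=True,
    top_k=50,                 # assumed sampling setting
    max_new_tokens=30,        # assumed length budget
    num_return_sequences=5,
)
for sample in samples:
    print(sample['generated_text'])
```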
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/itssixword/1629833127428/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/itssixword
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Six words story @itssixword I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Six words story. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @itssixword's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1457774258063437824/VgJyJ_c2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">uditgoenka.eth</div> <div style="text-align: center; font-size: 14px;">@iuditg</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from uditgoenka.eth. | Data | uditgoenka.eth | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 993 | | Short tweets | 450 | | Tweets kept | 1807 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1r2lhfr0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iuditg's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/iswph9y4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/iswph9y4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iuditg') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
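For batch-style use, the same pipeline can be called over several prompts in a loop. This is a sketch only; the prompts and generation settings below are hypothetical placeholders rather than anything from the original card.

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/iuditg')

# Hypothetical prompts; replace with whatever openings you want completed.
prompts = ["My dream is", "The best part of crypto is", "Today I learned"]
for prompt in prompts:
    result = generator(prompt, max_new_tokens=30, num_return_sequences=1)
    print(result[0]['generated_text'])
```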
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/iuditg/1639532212187/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/iuditg
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT URL @iuditg I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from URL. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iuditg's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
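The "Training data" table in the card above reports counts after dropping retweets and short tweets, but the filtering step itself is not shown anywhere in the card. Below is a minimal sketch of such a filter, assuming a simple heuristic (skip classic "RT @" retweets and tweets under three words); the function name `filter_tweets` and the word-count threshold are illustrative assumptions, not part of the huggingtweets codebase.

```python
# Minimal sketch of the retweet / short-tweet filtering summarized in the
# "Training data" table above. The heuristic below (skip "RT @" retweets and
# tweets under three words) is an assumption for illustration, not the actual
# huggingtweets preprocessing.

def filter_tweets(tweets):
    """Return the tweets kept after dropping retweets and very short tweets."""
    kept = []
    for text in tweets:
        if text.startswith("RT @"):      # classic retweet marker
            continue
        if len(text.split()) < 3:        # "short tweet" threshold (assumed)
            continue
        kept.append(text)
    return kept


if __name__ == "__main__":
    sample = [
        "RT @friend: worth a read",
        "gm",
        "Shipping a new GPT-2 bot trained on my timeline today",
    ]
    print(filter_tweets(sample))
    # -> ['Shipping a new GPT-2 bot trained on my timeline today']
```

The counts in the table (tweets downloaded, retweets, short tweets, tweets kept) are consistent with this kind of two-rule filter, though the exact rules used by the pipeline may differ.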
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1202257345734037504/tRJA6HEx_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">| praveen narayan 〉 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@ivanpeer bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@ivanpeer's tweets](https://twitter.com/ivanpeer). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>971</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>110</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>759</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/2thafoo8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ivanpeer's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3fepz7hm) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3fepz7hm/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ivanpeer'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ivanpeer/1603607581850/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ivanpeer
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">| praveen narayan 〉 AI Bot </div> <div style="font-size: 15px; color: #657786">@ivanpeer bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @ivanpeer's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>971</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>110</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>102</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>759</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/ivanpeer'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ivanpeer's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>971</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>110</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>759</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ivanpeer'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @ivanpeer's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>971</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>110</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>102</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>759</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @ivanpeer's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/ivanpeer'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 429, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
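The "Training procedure" sections above describe fine-tuning a pre-trained GPT-2 on a user's tweets with hyperparameters and metrics tracked in a W&B run, but no training code appears in the cards. The following is a minimal sketch under those assumptions, using the Hugging Face `Trainer` API with `report_to="wandb"`; the hyperparameters and the placeholder `tweets` list are illustrative and are not the values recorded in the actual runs.

```python
# A minimal sketch of the fine-tuning step described in the "Training procedure"
# sections above: GPT-2 fine-tuned on a user's tweets, with metrics reported to
# W&B. This is an illustrative assumption, not the actual huggingtweets training
# script; epochs, batch size, and the `tweets` list are placeholders.

import torch
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

tweets = ["example tweet one", "example tweet two"]  # placeholder training texts

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token            # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

encodings = tokenizer(tweets, truncation=True, max_length=128)


class TweetDataset(torch.utils.data.Dataset):
    """Wraps tokenized tweets so the Trainer can iterate over them."""

    def __init__(self, encodings):
        self.encodings = encodings

    def __len__(self):
        return len(self.encodings["input_ids"])

    def __getitem__(self, idx):
        return {k: torch.tensor(v[idx]) for k, v in self.encodings.items()}


args = TrainingArguments(
    output_dir="gpt2-tweets",
    num_train_epochs=1,              # placeholder; the linked W&B run records the real values
    per_device_train_batch_size=2,
    report_to="wandb",               # mirrors the W&B tracking mentioned in the cards
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=TweetDataset(encodings),
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The resulting checkpoint can then be loaded with the `text-generation` pipeline exactly as shown in the "How to use" sections of the cards.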
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1404902950607089665/CLa3e4aK_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">##lainpilled</div> <div style="text-align: center; font-size: 14px;">@ivegottagetagf</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from ##lainpilled. | Data | ##lainpilled | | --- | --- | | Tweets downloaded | 128 | | Retweets | 7 | | Short tweets | 16 | | Tweets kept | 105 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/7kyd6ojb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @ivegottagetagf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3ropyewj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3ropyewj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/ivegottagetagf') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/ivegottagetagf/1623876885491/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/ivegottagetagf
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT ##lainpilled @ivegottagetagf I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from ##lainpilled. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @ivegottagetagf's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/598663964340301824/im3Wzn-o_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Robert Evans (The Only Robert Evans)</div> <div style="text-align: center; font-size: 14px;">@iwriteok</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Robert Evans (The Only Robert Evans). | Data | Robert Evans (The Only Robert Evans) | | --- | --- | | Tweets downloaded | 3218 | | Retweets | 1269 | | Short tweets | 142 | | Tweets kept | 1807 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3hjcp2ib/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iwriteok's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/wq4n95ia) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/wq4n95ia/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iwriteok') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/iwriteok/1668924855688/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/iwriteok
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Robert Evans (The Only Robert Evans) @iwriteok I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Robert Evans (The Only Robert Evans). Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iwriteok's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1393393642958635008/P1qx1TlP_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">웃</div> <div style="text-align: center; font-size: 14px;">@iyxnmt</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 웃. | Data | 웃 | | --- | --- | | Tweets downloaded | 3073 | | Retweets | 1416 | | Short tweets | 660 | | Tweets kept | 997 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1lpd2izx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @iyxnmt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3qg153k0) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3qg153k0/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/iyxnmt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/iyxnmt/1621146502054/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/iyxnmt
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT 웃 @iyxnmt I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from 웃. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @iyxnmt's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1311033052592656385/V-9XECfj_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jamie Beck 🤖 AI Bot </div> <div style="font-size: 15px">@j_beck00 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@j_beck00's tweets](https://twitter.com/j_beck00). | Data | Quantity | | --- | --- | | Tweets downloaded | 75 | | Retweets | 14 | | Short tweets | 4 | | Tweets kept | 57 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/23mq58mv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @j_beck00's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2mbmtl4r) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2mbmtl4r/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/j_beck00') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/j_beck00/1617471704579/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/j_beck00
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jamie Beck AI Bot @j\_beck00 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @j\_beck00's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @j\_beck00's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL) For more details, visit the project repository. ![GitHub stars](URL)
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1333957151576887297/_1ExBQa3_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jocelyn (male) of the 365 Followers 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@j_j_j_j_j_jones bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@j_j_j_j_j_jones's tweets](https://twitter.com/j_j_j_j_j_jones). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3225</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>320</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>482</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2423</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/uz60miha/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @j_j_j_j_j_jones's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/soi1lw7l) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/soi1lw7l/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/j_j_j_j_j_jones'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). 
In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
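The card above calls the model through the high-level `pipeline` helper. Purely as an illustration (not part of the original card), the same checkpoint can be loaded explicitly with a tokenizer and `AutoModelForCausalLM`, which is handy when you need direct access to `generate()`; the prompt and generation settings below are assumptions.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "huggingtweets/j_j_j_j_j_jones"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode a prompt and sample a few continuations directly with generate().
inputs = tokenizer("My dream is", return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(
        **inputs,
        max_length=60,
        do_sample=True,
        top_p=0.95,
        num_return_sequences=3,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token by default
    )

for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```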
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/j_j_j_j_j_jones/1609141746129/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/j_j_j_j_j_jones
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jocelyn (male) of the 365 Followers AI Bot </div> <div style="font-size: 15px; color: #657786">@j_j_j_j_j_jones bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @j_j_j_j_j_jones's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3225</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>320</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>482</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2423</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/j_j_j_j_j_jones'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @j_j_j_j_j_jones's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3225</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>320</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>482</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2423</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/j_j_j_j_j_jones'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @j_j_j_j_j_jones's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3225</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>320</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>482</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2423</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @j_j_j_j_j_jones's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/j_j_j_j_j_jones'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 439, 84, 9, 176, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1115644092329758721/AFjOr-K8_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">jack</div> <div style="text-align: center; font-size: 14px;">@jack</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from jack. | Data | jack | | --- | --- | | Tweets downloaded | 3231 | | Retweets | 1147 | | Short tweets | 817 | | Tweets kept | 1267 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/dibfzjll/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jack's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3f3e0roo) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3f3e0roo/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jack') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jack/1653287961086/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jack
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT jack @jack I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from jack. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jack's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1314324082457018369/nPHIUIxe_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Walsh 🤖 AI Bot </div> <div style="font-size: 15px">@jack_walshh bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jack_walshh's tweets](https://twitter.com/jack_walshh). | Data | Quantity | | --- | --- | | Tweets downloaded | 1095 | | Retweets | 234 | | Short tweets | 121 | | Tweets kept | 740 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1o93caoq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jack_walshh's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23dq75x4) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23dq75x4/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jack_walshh') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
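The card above shows only the default pipeline call. As a complementary, hedged sketch (the extra keyword arguments are assumptions, not part of the card; they are standard `generate()` parameters that the `text-generation` pipeline forwards to the underlying GPT-2 model), sampling behaviour can be tuned like this:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jack_walshh')

# do_sample / top_p / temperature are ordinary generate() kwargs;
# the pipeline passes them through to the fine-tuned GPT-2.
outputs = generator(
    "My dream is",
    num_return_sequences=5,
    max_length=60,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
for out in outputs:
    print(out['generated_text'])
```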
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jack_walshh/1616646386178/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jack_walshh
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jack Walsh AI Bot @jack\_walshh bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jack\_walshh's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jack\_walshh's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
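The sentence "You can use this model directly with a pipeline for text generation:" in the processed text above originally introduced a code snippet that was stripped during processing. For reference, the snippet from the full card earlier in this record is:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jack_walshh')
generator("My dream is", num_return_sequences=5)
```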
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1251200537388695557/96JxUIrJ_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1384243878748856321/vreel6UH_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1417910390051246080/wKq6pjPR_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">DAN KOE & humble farmer & Jack Butcher</div> <div style="text-align: center; font-size: 14px;">@jackbutcher-paikcapital-thedankoe</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from DAN KOE & humble farmer & Jack Butcher. | Data | DAN KOE | humble farmer | Jack Butcher | | --- | --- | --- | --- | | Tweets downloaded | 3249 | 3247 | 3220 | | Retweets | 18 | 601 | 208 | | Short tweets | 899 | 500 | 1048 | | Tweets kept | 2332 | 2146 | 1964 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/mvqun4ol/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackbutcher-paikcapital-thedankoe's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2qd8720q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2qd8720q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackbutcher-paikcapital-thedankoe') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
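Beyond the pipeline call shown on the card above, the same checkpoint can be loaded with the lower-level `transformers` classes. This is a minimal sketch assuming the standard Auto* API, not something the card itself documents:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "huggingtweets/jackbutcher-paikcapital-thedankoe"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("My dream is", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    num_return_sequences=3,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token by default
)
for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```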
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jackbutcher-paikcapital-thedankoe
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG DAN KOE & humble farmer & Jack Butcher @jackbutcher-paikcapital-thedankoe I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from DAN KOE & humble farmer & Jack Butcher. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackbutcher-paikcapital-thedankoe's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
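As in the previous record, the code block referenced by "You can use this model directly with a pipeline for text generation:" was stripped from the processed text. A minimal sketch follows, with `set_seed` added as an assumption to make repeated sampling runs reproducible:

```python
from transformers import pipeline, set_seed

set_seed(42)  # assumption: fixes the RNG so repeated runs give the same samples
generator = pipeline('text-generation', model='huggingtweets/jackbutcher-paikcapital-thedankoe')
generator("My dream is", num_return_sequences=5)
```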
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/726446881547517952/ULhSTKxN_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Clark 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jackclarksf bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@jackclarksf's tweets](https://twitter.com/jackclarksf). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>603</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2426</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3r89xyps/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackclarksf's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3ovybsy5) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3ovybsy5/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jackclarksf'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
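The card's usage snippet shows only the high-level `pipeline` helper. A minimal sketch of the equivalent explicit calls, assuming the standard `transformers` Auto classes and the checkpoint name above (the seed and generation settings are illustrative, not taken from the card):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, set_seed

# Load the fine-tuned GPT-2 checkpoint from the Hugging Face Hub.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/jackclarksf")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/jackclarksf")

set_seed(42)  # fix the RNG so the sampled tweets are repeatable

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,                       # sample rather than greedy decode
    max_length=60,                        # tweets are short, so cap generation
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```

This is roughly what the `pipeline('text-generation', ...)` call wraps; it is not code from the huggingtweets project itself.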
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jackclarksf
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jack Clark AI Bot </div> <div style="font-size: 15px; color: #657786">@jackclarksf bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jackclarksf's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3216</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>603</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>187</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2426</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jackclarksf'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jackclarksf's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>603</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2426</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jackclarksf'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jackclarksf's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3216</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>603</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>187</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2426</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jackclarksf's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jackclarksf'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1316066224938332162/hVIofspH_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">JackGordon 🤖 AI Bot </div> <div style="font-size: 15px">@jackgordonyt bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jackgordonyt's tweets](https://twitter.com/jackgordonyt). | Data | Quantity | | --- | --- | | Tweets downloaded | 660 | | Retweets | 146 | | Short tweets | 106 | | Tweets kept | 408 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3d7wzfbd/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackgordonyt's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/fa0cjwj6) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/fa0cjwj6/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackgordonyt') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
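The training-data table above follows a simple accounting: tweets kept = tweets downloaded − retweets − short tweets (660 − 146 − 106 = 408). A sketch of that filtering step, assuming tweets arrive as dicts with `text` and `is_retweet` fields and using a hypothetical length cutoff for "short" tweets (the threshold huggingtweets actually applies is not stated in the card):

```python
def split_tweets(tweets, short_len=40):
    """Partition raw tweets the way the card's table tallies them.

    `tweets` is a list of dicts with 'text' and 'is_retweet' keys;
    `short_len` is a hypothetical cutoff for what counts as a short tweet.
    """
    retweets = [t for t in tweets if t["is_retweet"]]
    originals = [t for t in tweets if not t["is_retweet"]]
    short = [t for t in originals if len(t["text"]) < short_len]
    kept = [t for t in originals if len(t["text"]) >= short_len]
    # Invariant implied by the table: downloaded == retweets + short + kept
    assert len(tweets) == len(retweets) + len(short) + len(kept)
    return retweets, short, kept
```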
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackgordonyt/1615830241451/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jackgordonyt
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
JackGordon AI Bot @jackgordonyt bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jackgordonyt's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackgordonyt's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1365531191470923776/iPBbGURg_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">JackieRacc_VTuber</div> <div style="text-align: center; font-size: 14px;">@jackieracc_</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from JackieRacc_VTuber. | Data | JackieRacc_VTuber | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 252 | | Short tweets | 827 | | Tweets kept | 2170 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gx7e8h18/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackieracc_'s tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1cvwo68s) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1cvwo68s/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackieracc_') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackieracc_/1620680912006/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jackieracc_
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT JackieRacc\_VTuber @jackieracc\_ I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from JackieRacc\_VTuber. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackieracc\_'s tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1026642891374874625/GPdw8p_L_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jacknjellify 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jacknjellify bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jacknjellify's tweets](https://twitter.com/jacknjellify). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1025</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>336</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1742</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/nmeryp1f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jacknjellify's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3q5b8kag) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3q5b8kag/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jacknjellify'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
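The card above loads the checkpoint through the high-level `pipeline` helper. As a hedged alternative sketch (not part of the original card), the same fine-tuned GPT-2 checkpoint can be loaded with the generic Auto classes from `transformers`; the decoding settings below (`top_p`, `max_new_tokens`) are illustrative assumptions, not values tied to this model's training.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned GPT-2 checkpoint and its tokenizer from the Hub.
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/jacknjellify")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/jacknjellify")

# Encode a prompt and sample a few continuations.
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    top_p=0.95,             # nucleus sampling; illustrative value
    max_new_tokens=40,      # illustrative length cap
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```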
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jacknjellify
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jacknjellify AI Bot </div> <div style="font-size: 15px; color: #657786">@jacknjellify bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jacknjellify's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3103</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>1025</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>336</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>1742</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jacknjellify'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jacknjellify's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1025</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>336</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1742</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jacknjellify'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jacknjellify's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3103</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>1025</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>336</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>1742</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jacknjellify's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jacknjellify'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1418813091140227072/iXDCqBz0_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jack Posobiec 🇺🇸</div> <div style="text-align: center; font-size: 14px;">@jackposobiec</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jack Posobiec 🇺🇸. | Data | Jack Posobiec 🇺🇸 | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 818 | | Short tweets | 511 | | Tweets kept | 1917 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3s4mnium/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jackposobiec's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2vllrmfa) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2vllrmfa/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jackposobiec') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jackposobiec/1630169093455/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jackposobiec
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jack Posobiec 🇺🇸 @jackposobiec I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jack Posobiec 🇺🇸. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jackposobiec's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1523106752668966913/tWNV2zbS_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">jacksfilms🌹</div> <div style="text-align: center; font-size: 14px;">@jacksfilms</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from jacksfilms🌹. | Data | jacksfilms🌹 | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 97 | | Short tweets | 444 | | Tweets kept | 2708 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1hsenlsv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jacksfilms's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2ow20675) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2ow20675/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jacksfilms') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jacksfilms/1653095886748/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jacksfilms
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT jacksfilms @jacksfilms I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from jacksfilms. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jacksfilms's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1284100202421342209/MVXATULR_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Day6 Jae 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jae_day6 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jae_day6's tweets](https://twitter.com/jae_day6). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3229</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>123</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1021</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2085</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/3lpvhxwq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jae_day6's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/3vyjrutx) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/3vyjrutx/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jae_day6'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets) <!--- random size file -->
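Since this card stresses transparency and reproducibility, a hedged usage sketch (not part of the original card) is shown below: it fixes the random seed so repeated runs sample the same continuations. The generation keyword arguments are illustrative assumptions, not settings documented by the project.

```python
from transformers import pipeline, set_seed

set_seed(42)  # fix the RNG so repeated runs sample the same continuations
generator = pipeline('text-generation', model='huggingtweets/jae_day6')
samples = generator(
    "My dream is",
    num_return_sequences=5,
    max_length=40,   # illustrative cap; tweets are short
    top_k=50,        # illustrative sampling restriction
)
for sample in samples:
    print(sample["generated_text"])
```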
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jae_day6/1601274497991/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jae_day6
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Day6 Jae AI Bot </div> <div style="font-size: 15px; color: #657786">@jae_day6 bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jae_day6's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3229</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>123</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>1021</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2085</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jae_day6'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jae_day6's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3229</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>123</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1021</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2085</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jae_day6'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jae_day6's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3229</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>123</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>1021</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2085</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jae_day6's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jae_day6'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 432, 77, 9, 169, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1410183697534439426/Db5MDUaw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Programo, luego existo</div> <div style="text-align: center; font-size: 14px;">@jagedn</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Programo, luego existo. | Data | Programo, luego existo | | --- | --- | | Tweets downloaded | 3244 | | Retweets | 549 | | Short tweets | 220 | | Tweets kept | 2475 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/ptz28obp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jagedn's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1i8g6srp) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1i8g6srp/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jagedn') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jagedn/1625062317603/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jagedn
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Programo, luego existo @jagedn I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Programo, luego existo. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jagedn's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374457518995349507/LPSYSW4N_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">certified cael moment™ 🔜 BLFC 🤖 AI Bot </div> <div style="font-size: 15px">@jaguarunlocked bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jaguarunlocked's tweets](https://twitter.com/jaguarunlocked). | Data | Quantity | | --- | --- | | Tweets downloaded | 3176 | | Retweets | 1521 | | Short tweets | 203 | | Tweets kept | 1452 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2j5t38f8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jaguarunlocked's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3n6tm7lj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3n6tm7lj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jaguarunlocked') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jaguarunlocked/1617770655879/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jaguarunlocked
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
certified cael moment™ BLFC AI Bot @jaguarunlocked bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jaguarunlocked's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jaguarunlocked's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1374180612609880068/QkkHvC6R_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">jacob 🤖 AI Bot </div> <div style="font-size: 15px">@jakeaccino bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jakeaccino's tweets](https://twitter.com/jakeaccino). | Data | Quantity | | --- | --- | | Tweets downloaded | 179 | | Retweets | 8 | | Short tweets | 53 | | Tweets kept | 118 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/239ufxkc/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jakeaccino's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3myo5k1y) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3myo5k1y/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jakeaccino') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jakeaccino
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
jacob AI Bot @jakeaccino bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jakeaccino's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jakeaccino's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('http://pbs.twimg.com/profile_images/1253134985948614657/xN4lDF3W_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Cham ✍🏻 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jamescham bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://bit.ly/2TGXMZf). ## Training data The model was trained on [@jamescham's tweets](https://twitter.com/jamescham). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>744</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>317</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2152</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/20ku8js2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamescham's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/32to3ioi) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/32to3ioi/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jamescham'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
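The training-procedure section above only links to the W&B run for its hyperparameters. As a rough, hedged sketch of what fine-tuning GPT-2 on a plain-text file of tweets can look like with the `transformers` Trainer API — the file name `tweets.txt`, the hyperparameters, and the use of `Trainer` are illustrative assumptions, not the actual huggingtweets training script:

```python
# Minimal sketch of fine-tuning GPT-2 on a plain-text file of tweets.
# Assumptions: one tweet per line in "tweets.txt"; hyperparameters are illustrative,
# not the values recorded in the W&B run referenced above.
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default
model = AutoModelForCausalLM.from_pretrained("gpt2")

# Load and tokenize the tweets (assumed file name and max length).
dataset = load_dataset("text", data_files={"train": "tweets.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Causal language-modeling objective (mlm=False): labels are derived from the input ids.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="gpt2-tweets",
    num_train_epochs=4,
    per_device_train_batch_size=8,
    learning_rate=5e-5,
    logging_steps=50,
    report_to="wandb",  # mirrors the W&B tracking mentioned above; requires wandb installed
)

Trainer(model=model, args=args, data_collator=collator, train_dataset=tokenized).train()
```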
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo_share.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jamescham
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Cham AI Bot </div> <div style="font-size: 15px; color: #657786">@jamescham bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jamescham's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3213</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>744</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>317</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2152</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jamescham'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jamescham's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>744</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>317</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2152</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jamescham'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jamescham's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3213</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>744</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>317</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2152</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jamescham's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jamescham'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 430, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1420806762408464385/10y3M0iO_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1324782032124215296/HMG6-q8g_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1401837042934468611/okzqIoMb_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">CANCELLED & James Charles & Logan Paul</div> <div style="text-align: center; font-size: 14px;">@jamescharles-loganpaul-tanamongeau</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from CANCELLED & James Charles & Logan Paul. | Data | CANCELLED | James Charles | Logan Paul | | --- | --- | --- | --- | | Tweets downloaded | 3167 | 3182 | 3246 | | Retweets | 938 | 480 | 98 | | Short tweets | 522 | 496 | 287 | | Tweets kept | 1707 | 2206 | 2861 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2avr905u/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamescharles-loganpaul-tanamongeau's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2at101p1) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2at101p1/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamescharles-loganpaul-tanamongeau') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. 
[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
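The "How to use" snippet in the card above calls the pipeline with its defaults; in practice the output depends heavily on the decoding settings. A minimal sketch with common sampling parameters follows — the values are illustrative assumptions, not settings tuned for this model:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jamescharles-loganpaul-tanamongeau')

# Sampling settings below are illustrative defaults, not values tuned for this checkpoint.
outputs = generator(
    "My dream is",
    max_length=60,          # tweets are short, so a small generation budget is usually enough
    do_sample=True,         # sample instead of greedy decoding
    top_p=0.95,             # nucleus sampling
    temperature=0.9,
    num_return_sequences=5,
)
for out in outputs:
    print(out["generated_text"])
```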
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamescharles-loganpaul-tanamongeau/1631598787303/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jamescharles-loganpaul-tanamongeau
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI CYBORG CANCELLED & James Charles & Logan Paul @jamescharles-loganpaul-tanamongeau I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from CANCELLED & James Charles & Logan Paul. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamescharles-loganpaul-tanamongeau's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/958932211973152769/FUpkmn4u_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Clear 🤖 AI Bot </div> <div style="font-size: 15px">@jamesclear bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jamesclear's tweets](https://twitter.com/jamesclear). | Data | Quantity | | --- | --- | | Tweets downloaded | 3247 | | Retweets | 190 | | Short tweets | 385 | | Tweets kept | 2672 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2hvyoab9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamesclear's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/v67076s3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/v67076s3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamesclear') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
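As an alternative to the pipeline shown in the card above, the checkpoint can also be loaded directly with `AutoTokenizer`/`AutoModelForCausalLM` and sampled via `generate`; the decoding parameters below are illustrative assumptions, not recommended settings for this model:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/jamesclear")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/jamesclear")

inputs = tokenizer("My dream is", return_tensors="pt")
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_length=60,
        do_sample=True,
        top_p=0.95,
        num_return_sequences=3,
        pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token, so reuse EOS
    )
for ids in output_ids:
    print(tokenizer.decode(ids, skip_special_tokens=True))
```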
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamesclear/1616666243525/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jamesclear
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Clear AI Bot @jamesclear bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jamesclear's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamesclear's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
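The pipeline call referenced above ("You can use this model directly with a pipeline for text generation"), shown as a minimal sketch mirroring the example from the original card (model id `huggingtweets/jamesclear` as published on the Hub):

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint as a text-generation pipeline
generator = pipeline('text-generation', model='huggingtweets/jamesclear')

# Sample five continuations of the prompt
generator("My dream is", num_return_sequences=5)
```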
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1309510427240534022/Us-RCD-5_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Hutton 🤖 AI Bot </div> <div style="font-size: 15px">@jameshuttonphil bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jameshuttonphil's tweets](https://twitter.com/jameshuttonphil). | Data | Quantity | | --- | --- | | Tweets downloaded | 648 | | Retweets | 25 | | Short tweets | 89 | | Tweets kept | 534 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/bamdk9dm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jameshuttonphil's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2jp3j37a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2jp3j37a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jameshuttonphil') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jameshuttonphil/1617296338533/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jameshuttonphil
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Hutton AI Bot @jameshuttonphil bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jameshuttonphil's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jameshuttonphil's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
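A lower-level sketch of the same text-generation usage described above, assuming the standard `transformers` Auto classes instead of the pipeline helper (model id `huggingtweets/jameshuttonphil` as in the original card; generation settings here are illustrative):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned GPT-2 tokenizer and weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/jameshuttonphil")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/jameshuttonphil")

# Encode a prompt and sample a few tweet-style continuations
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,
    max_new_tokens=40,
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS
)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```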
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1040878369594896384/eusyG8Np_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">James Sherlock 🤖 AI Bot </div> <div style="font-size: 15px">@jamespsherlock bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jamespsherlock's tweets](https://twitter.com/jamespsherlock). | Data | Quantity | | --- | --- | | Tweets downloaded | 743 | | Retweets | 260 | | Short tweets | 44 | | Tweets kept | 439 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2ulatc4k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamespsherlock's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1btltx5f) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1btltx5f/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamespsherlock') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamespsherlock/1616781166201/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jamespsherlock
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
James Sherlock AI Bot @jamespsherlock bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jamespsherlock's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamespsherlock's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
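The same pipeline sketch for the card above, with explicit sampling parameters that the original example leaves at their defaults (the values below are illustrative assumptions, not settings taken from the card):

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jamespsherlock')

# do_sample, temperature, and max_length are generation kwargs forwarded to model.generate
generator(
    "My dream is",
    num_return_sequences=5,
    do_sample=True,
    temperature=0.9,
    max_length=50,
)
```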
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1375502415240122373/JO1DArJT_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jamila Husain</div> <div style="text-align: center; font-size: 14px;">@jamz5251</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jamila Husain. | Data | Jamila Husain | | --- | --- | | Tweets downloaded | 3234 | | Retweets | 900 | | Short tweets | 65 | | Tweets kept | 2269 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/r9z40rld/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jamz5251's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/20gadkdv) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/20gadkdv/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jamz5251') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jamz5251/1622370618440/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jamz5251
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jamila Husain @jamz5251 I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jamila Husain. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jamz5251's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
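Note: the stripped text above keeps the "How to use" prose but drops the fenced code block present in the full card earlier in this record. For reference, a minimal usage sketch reconstructed from that full card (model id huggingtweets/jamz5251 is confirmed by the record's id field) is:

```python
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint for @jamz5251 and generate tweet-like text.
generator = pipeline('text-generation', model='huggingtweets/jamz5251')
generator("My dream is", num_return_sequences=5)
```

The janieclone and janiedied cards further down follow the same pattern, swapping in huggingtweets/janieclone and huggingtweets/janiedied as the model id.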
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1536389142287892481/N6kCwACw_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Columbine Janie</div> <div style="text-align: center; font-size: 14px;">@janieclone</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Columbine Janie. | Data | Columbine Janie | | --- | --- | | Tweets downloaded | 3072 | | Retweets | 1211 | | Short tweets | 462 | | Tweets kept | 1399 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1divgffx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @janieclone's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1ic6ynmd) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1ic6ynmd/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/janieclone') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/janieclone
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Columbine Janie @janieclone I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Columbine Janie. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @janieclone's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1455690959132532738/Z4UvDtLA_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Poolgirl Janie Diamond</div> <div style="text-align: center; font-size: 14px;">@janiedied</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Poolgirl Janie Diamond. | Data | Poolgirl Janie Diamond | | --- | --- | | Tweets downloaded | 1505 | | Retweets | 552 | | Short tweets | 283 | | Tweets kept | 670 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3232onrl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @janiedied's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/smx9pf1l) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/smx9pf1l/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/janiedied') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/janiedied/1645111847557/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/janiedied
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Poolgirl Janie Diamond @janiedied I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Poolgirl Janie Diamond. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @janiedied's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/2158604209/feuilles_400x400.png')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Au Jardin 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@jardininfo bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jardininfo's tweets](https://twitter.com/jardininfo). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>375</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2825</td> </tr> </tbody> </table> [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/48yjj01v/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jardininfo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3t0scjqn) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3t0scjqn/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jardininfo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jardininfo/1610568803876/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jardininfo
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Au Jardin AI Bot </div> <div style="font-size: 15px; color: #657786">@jardininfo bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @jardininfo's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3200</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>0</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>375</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>2825</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/jardininfo'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jardininfo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>375</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2825</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jardininfo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @jardininfo's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3200</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>0</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>375</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>2825</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @jardininfo's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/jardininfo'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 429, 75, 9, 167, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
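The long float lists in the embeddings fields above appear to be fixed-length dense vectors, one per processed passage. Assuming that interpretation (the embedding model is not identified in this excerpt), a minimal sketch of how two such vectors could be compared is:

```python
# Hypothetical sketch: cosine similarity between two vectors from the embeddings
# column. Assumes each entry is a fixed-length dense vector for one passage.
import numpy as np

def cosine_similarity(a, b):
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Truncated placeholder vectors for illustration only, not full rows from the dump.
vec_a = [-0.0001764, -0.0189157, -0.0068881, 0.0124289]
vec_b = [-0.0408459, 0.0355966, -0.0024457, 0.0466297]
print(cosine_similarity(vec_a, vec_b))
```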
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1360504508678246401/WpE9tJiC_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jason Chen 🤖 AI Bot </div> <div style="font-size: 15px">@jasonchen0325 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jasonchen0325's tweets](https://twitter.com/jasonchen0325). | Data | Quantity | | --- | --- | | Tweets downloaded | 354 | | Retweets | 30 | | Short tweets | 9 | | Tweets kept | 315 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wsqq8bl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jasonchen0325's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/w2gqjxjr) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/w2gqjxjr/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jasonchen0325') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jasonchen0325/1616715271094/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jasonchen0325
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jason Chen AI Bot
@jasonchen0325 bot

I was made with huggingtweets.

Create your own bot based on your favorite user with the demo!

How does it work?
-----------------

The model uses the following pipeline.

!pipeline

To understand how the model was developed, check the W&B report.

Training data
-------------

The model was trained on @jasonchen0325's tweets.

Explore the data, which is tracked with W&B artifacts at every step of the pipeline.

Training procedure
------------------

The model is based on a pre-trained GPT-2 which is fine-tuned on @jasonchen0325's tweets.

Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.

At the end of training, the final model is logged and versioned.

How to use
----------

You can use this model directly with a pipeline for text generation:

Limitations and bias
--------------------

The model suffers from the same limitations and bias as GPT-2.

In addition, the data present in the user's tweets further affects the text generated by the model.

About
-----

*Built by Boris Dayma*

![Follow](URL

For more details, visit the project repository.

![GitHub stars](URL
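In this cleaned text the code block introduced under "How to use" does not survive. For reference, the snippet from the full card field of the same row (same model, same call) is:

```python
# Text generation with the fine-tuned model, as shown in the full model card.
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jasonchen0325')
generator("My dream is", num_return_sequences=5)
```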
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/757580607882948608/-KXkY5qL_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">J.A. Sutherland SciFi Books 🤖 AI Bot </div> <div style="font-size: 15px">@jasutherlandbks bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jasutherlandbks's tweets](https://twitter.com/jasutherlandbks). | Data | Quantity | | --- | --- | | Tweets downloaded | 3192 | | Retweets | 952 | | Short tweets | 169 | | Tweets kept | 2071 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/210hhn5z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jasutherlandbks's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/23qtgnsl) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/23qtgnsl/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jasutherlandbks') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jasutherlandbks/1616634473974/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jasutherlandbks
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
J.A. Sutherland SciFi Books AI Bot @jasutherlandbks bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jasutherlandbks's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jasutherlandbks's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* For more details, visit the project repository.
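For convenience, the "How to use" snippet from the full card above is reproduced here as a minimal, runnable sketch; the model id `huggingtweets/jasutherlandbks` and the example prompt are taken directly from that card.

```python
# Minimal text-generation sketch, as shown in the full model card above.
from transformers import pipeline

# Load the fine-tuned GPT-2 checkpoint by its Hub id.
generator = pipeline('text-generation', model='huggingtweets/jasutherlandbks')

# Sample five continuations of the card's example prompt.
generator("My dream is", num_return_sequences=5)
```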
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1370572501269553152/Tl_3viV2_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!!</div> <div style="text-align: center; font-size: 14px;">@jattazo</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from 🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!!. | Data | 🧠Jattazo Shin🧠 !!COMMISSIONS OPEN!! | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 196 | | Short tweets | 757 | | Tweets kept | 2290 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/oc8tbgql/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jattazo's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/7n3lt4bb) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/7n3lt4bb/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jattazo') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jattazo/1620679164511/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jattazo
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jattazo Shin !!COMMISSIONS OPEN!! @jattazo I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jattazo Shin !!COMMISSIONS OPEN!!. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jattazo's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* For more details, visit the project repository.
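The full card above only shows the `pipeline` API. As a hedged alternative, the same checkpoint can presumably also be loaded with the standard transformers Auto classes; the generation settings below (`max_length`, `do_sample`, the EOS pad token) are illustrative assumptions, not values taken from the card.

```python
# Hedged sketch: explicit loading with the standard transformers Auto classes.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("huggingtweets/jattazo")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/jattazo")

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=40,                        # assumed limit, not from the card
    do_sample=True,                       # sample rather than greedy decode
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```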
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1356081411762040834/9L5GPrEi_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">🧠Jattazo Shin🧠 🤖 AI Bot </div> <div style="font-size: 15px">@jattazoshin bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jattazoshin's tweets](https://twitter.com/jattazoshin). | Data | Quantity | | --- | --- | | Tweets downloaded | 2768 | | Retweets | 179 | | Short tweets | 414 | | Tweets kept | 2175 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2gto8yaa/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jattazoshin's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1gdg6xx3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1gdg6xx3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jattazoshin') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jattazoshin/1613105660546/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jattazoshin
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jattazo Shin AI Bot @jattazoshin bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jattazoshin's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jattazoshin's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* For more details, visit the project repository.
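As in the other cards, usage goes through a text-generation pipeline. The sketch below mirrors the snippet in the full card above, with `set_seed` (a standard transformers utility) added as an optional, assumed step to make sampling reproducible; the seed value itself is arbitrary and not part of the original card.

```python
# Pipeline usage mirroring the full card above; set_seed is an optional addition.
from transformers import pipeline, set_seed

set_seed(42)  # assumption: any fixed seed gives reproducible samples

generator = pipeline('text-generation', model='huggingtweets/jattazoshin')
generator("My dream is", num_return_sequences=5)
```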
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1377879160173993987/20XH6CdP_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Cool Narcissist 🤖 AI Bot </div> <div style="font-size: 15px">@java_jigga bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@java_jigga's tweets](https://twitter.com/java_jigga). | Data | Quantity | | --- | --- | | Tweets downloaded | 3246 | | Retweets | 313 | | Short tweets | 426 | | Tweets kept | 2507 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/kvpyc8u1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @java_jigga's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/6p3ishch) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/6p3ishch/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/java_jigga') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/java_jigga/1617788084385/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/java_jigga
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Cool Narcissist AI Bot @java\_jigga bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @java\_jigga's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @java\_jigga's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
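Each card's "How to use" section relies on the high-level `pipeline` helper. If finer control over decoding is wanted, the same checkpoints can be driven through the lower-level `transformers` API; the sketch below uses the model id from the record above, while the sampling settings (`top_p`, `max_new_tokens`) are illustrative choices rather than values documented in these cards.

```python
# Hedged sketch: loading one of the models listed in these records directly,
# rather than through the high-level pipeline shown in each card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "huggingtweets/java_jigga"   # taken from the record above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    do_sample=True,          # sample rather than greedy-decode
    top_p=0.95,              # nucleus sampling threshold (illustrative)
    max_new_tokens=40,       # keep completions roughly tweet-length
    num_return_sequences=5,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token by default
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```

Setting `pad_token_id` explicitly avoids the warning GPT-2 otherwise emits during generation, since the base model defines no padding token.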
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1329236115325390850/L6QYc5Qd_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Javi Ballester 🤖 AI Bot </div> <div style="font-size: 15px">@javiballester4 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javiballester4's tweets](https://twitter.com/javiballester4). | Data | Quantity | | --- | --- | | Tweets downloaded | 369 | | Retweets | 4 | | Short tweets | 108 | | Tweets kept | 257 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/33xklndf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javiballester4's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/s6kbzp61) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/s6kbzp61/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/javiballester4') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javiballester4/1616630748333/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/javiballester4
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Javi Ballester AI Bot @javiballester4 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @javiballester4's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @javiballester4's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
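Besides the card text, each row ends with derived columns: a token count (57 for these records) and a 768-dimensional embedding of the `passage: TAGS ...` input text. The dump does not name the encoder that produced them; the `passage:` prefix and the 768-dimensional output merely suggest an E5-style model, so the sketch below uses `intfloat/e5-base` purely as an assumption and mean-pools the hidden states, which may not match the original pipeline exactly.

```python
# Hedged sketch of how the tokens_length and embeddings columns in these rows
# might be regenerated. The encoder is an assumption, not confirmed by the dump.
import torch
from transformers import AutoModel, AutoTokenizer

encoder_id = "intfloat/e5-base"  # assumed E5-style encoder (768-dim, "passage:" prefix)
tokenizer = AutoTokenizer.from_pretrained(encoder_id)
encoder = AutoModel.from_pretrained(encoder_id)

text = ("passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation "
        "#huggingtweets #en #autotrain_compatible #endpoints_compatible "
        "#text-generation-inference #region-us \n")

tokens = tokenizer(text, return_tensors="pt")
print(tokens["input_ids"].shape[1])                    # token count, analogous to tokens_length

with torch.no_grad():
    hidden = encoder(**tokens).last_hidden_state       # (1, seq_len, 768)
    mask = tokens["attention_mask"].unsqueeze(-1)      # zero out padding positions
    embedding = (hidden * mask).sum(1) / mask.sum(1)   # mean pooling -> (1, 768)
print(embedding.shape)
```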
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1355323445622558725/vl1-gUcf_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Javierhalamadrid 🤖 AI Bot </div> <div style="font-size: 15px">@javierito321 bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javierito321's tweets](https://twitter.com/javierito321). | Data | Quantity | | --- | --- | | Tweets downloaded | 3243 | | Retweets | 144 | | Short tweets | 90 | | Tweets kept | 3009 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/36rnr3vs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javierito321's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/bj56pvfw) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/bj56pvfw/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/javierito321') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javierito321/1617016242704/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/javierito321
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Javierhalamadrid AI Bot @javierito321 bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @javierito321's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @javierito321's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
null
null
transformers
<link rel="stylesheet" href="https://unpkg.com/@tailwindcss/[email protected]/dist/typography.min.css"> <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/922569683420794880/bk2ERDe2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gabor Javorszky 🤖 AI Bot </div> <div style="font-size: 15px; color: #657786">@javorszky bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@javorszky's tweets](https://twitter.com/javorszky). <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3137</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2139</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>931</td> </tr> </tbody> </table> [Explore the data](https://app.wandb.ai/wandb/huggingtweets/runs/1cyr2cuz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @javorszky's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://app.wandb.ai/wandb/huggingtweets/runs/2sa503ur) for full transparency and reproducibility. At the end of training, [the final model](https://app.wandb.ai/wandb/huggingtweets/runs/2sa503ur/artifacts) is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/javorszky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. 
## About *Built by Boris Dayma* </section> [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) <section class='prose'> For more details, visit the project repository. </section> [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
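The card above shows usage through the high-level pipeline; as a hedged alternative (not part of the original card), the same checkpoint can presumably also be loaded with the generic transformers auto classes. A minimal sketch, with illustrative generation settings:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the tokenizer and fine-tuned GPT-2 weights for this checkpoint
tokenizer = AutoTokenizer.from_pretrained("huggingtweets/javorszky")
model = AutoModelForCausalLM.from_pretrained("huggingtweets/javorszky")

# Encode a prompt and sample a few continuations
inputs = tokenizer("My dream is", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=40,          # illustrative value, not from the card
    do_sample=True,         # sample rather than greedy decode
    num_return_sequences=3,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```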
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/javorszky/1602234108282/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/javorszky
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<link rel="stylesheet" href="URL <style> @media (prefers-color-scheme: dark) { .prose { color: #E2E8F0 !important; } .prose h2, .prose h3, .prose a, .prose thead { color: #F7FAFC !important; } } </style> <section class='prose'> <div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('URL </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Gabor Javorszky AI Bot </div> <div style="font-size: 15px; color: #657786">@javorszky bot</div> </div> I was made with huggingtweets. Create your own bot based on your favorite user with the demo! ## How does it work? The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. ## Training data The model was trained on @javorszky's tweets. <table style='border-width:0'> <thead style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #CBD5E0'> <th style='border-width:0'>Data</th> <th style='border-width:0'>Quantity</th> </tr> </thead> <tbody style='border-width:0'> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Tweets downloaded</td> <td style='border-width:0'>3137</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Retweets</td> <td style='border-width:0'>2139</td> </tr> <tr style='border-width:0 0 1px 0; border-color: #E2E8F0'> <td style='border-width:0'>Short tweets</td> <td style='border-width:0'>67</td> </tr> <tr style='border-width:0'> <td style='border-width:0'>Tweets kept</td> <td style='border-width:0'>931</td> </tr> </tbody> </table> Explore the data, which is tracked with W&B artifacts at every step of the pipeline. ## Training procedure The model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. ## Intended uses & limitations ### How to use You can use this model directly with a pipeline for text generation: <pre><code><span style="color:#03A9F4">from</span> transformers <span style="color:#03A9F4">import</span> pipeline generator = pipeline(<span style="color:#FF9800">'text-generation'</span>, model=<span style="color:#FF9800">'huggingtweets/javorszky'</span>) generator(<span style="color:#FF9800">"My dream is"</span>, num_return_sequences=<span style="color:#8BC34A">5</span>)</code></pre> ### Limitations and bias The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* </section> ![Follow](URL <section class='prose'> For more details, visit the project repository. </section> ![GitHub stars](URL
[ "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @javorszky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3137</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2139</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>931</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/javorszky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report.", "## Training data\n\nThe model was trained on @javorszky's tweets.\n\n<table style='border-width:0'>\n<thead style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #CBD5E0'>\n<th style='border-width:0'>Data</th>\n<th style='border-width:0'>Quantity</th>\n</tr>\n</thead>\n<tbody style='border-width:0'>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Tweets downloaded</td>\n<td style='border-width:0'>3137</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Retweets</td>\n<td style='border-width:0'>2139</td>\n</tr>\n<tr style='border-width:0 0 1px 0; border-color: #E2E8F0'>\n<td style='border-width:0'>Short tweets</td>\n<td style='border-width:0'>67</td>\n</tr>\n<tr style='border-width:0'>\n<td style='border-width:0'>Tweets kept</td>\n<td style='border-width:0'>931</td>\n</tr>\n</tbody>\n</table>\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.", "## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on @javorszky's tweets.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.", "## Intended uses & limitations", "### How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n<pre><code><span style=\"color:#03A9F4\">from</span> transformers <span style=\"color:#03A9F4\">import</span> pipeline\ngenerator = pipeline(<span style=\"color:#FF9800\">'text-generation'</span>,\n model=<span style=\"color:#FF9800\">'huggingtweets/javorszky'</span>)\ngenerator(<span style=\"color:#FF9800\">\"My dream is\"</span>, num_return_sequences=<span style=\"color:#8BC34A\">5</span>)</code></pre>", "### Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.", "## About\n\n*Built by Boris Dayma*\n\n</section>\n\n![Follow](URL\n\n<section class='prose'>\nFor more details, visit the project repository.\n</section>\n\n![GitHub stars](URL" ]
[ 57, 34, 431, 76, 9, 168, 48, 58 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n## How does it work?\n\nThe model uses the following pipeline.\n\n!pipeline\n\nTo understand how the model was developed, check the W&B report." ]
[ -0.04084593802690506, 0.035596683621406555, -0.0024457171093672514, 0.04662978649139404, 0.10991521924734116, 0.022836215794086456, 0.12812861800193787, 0.0424627922475338, -0.03746044635772705, -0.03597303107380867, 0.22758877277374268, 0.1009177565574646, 0.03089720755815506, 0.17962171137332916, 0.010350672528147697, -0.2703946828842163, 0.015237200073897839, 0.0647135004401207, -0.07720091193914413, 0.15752871334552765, 0.05562684312462807, -0.049801189452409744, 0.08214939385652542, 0.032038331031799316, -0.165513277053833, -0.004831716418266296, -0.02072383277118206, -0.04504403471946716, 0.09232694655656815, 0.06956911832094193, 0.07011176645755768, 0.034282486885786057, 0.017851393669843674, -0.0714372918009758, 0.06354191154241562, 0.014452377334237099, -0.02349161170423031, 0.13615116477012634, 0.028668763116002083, -0.0002947957837022841, 0.1798527091741562, 0.11815319955348969, 0.016722125932574272, 0.016196802258491516, -0.1166125237941742, -0.0606788769364357, 0.012365915812551975, 0.04470464214682579, 0.10005760192871094, 0.058856748044490814, 0.01943058706820011, 0.13215520977973938, -0.1164829432964325, 0.08512931317090988, 0.1828341782093048, -0.24191047251224518, -0.006823094096034765, 0.061043880879879, 0.08518847078084946, 0.02852868288755417, -0.0027500083670020103, 0.050437480211257935, 0.06484624743461609, 0.021347152069211006, 0.03636380657553673, -0.0638979896903038, 0.05009674280881882, 0.011186900548636913, -0.10215270519256592, -0.07660897821187973, 0.2294245809316635, -0.0602995827794075, 0.004886541981250048, -0.028186868876218796, -0.08490151911973953, -0.06026294082403183, -0.012795685790479183, -0.05738116800785065, -0.01700775697827339, 0.034475523978471756, 0.0214629378169775, -0.10922146588563919, -0.07046429812908173, -0.1146778091788292, -0.09578189998865128, 0.17879855632781982, -0.02900231070816517, 0.08940442651510239, -0.2603394091129303, 0.2254239320755005, 0.07019765675067902, -0.11909467726945877, 0.04757627099752426, -0.11814092099666595, 0.08530338853597641, 0.02207805961370468, 0.041434839367866516, 0.07761925458908081, 0.04161522537469864, 0.11940909177064896, 0.03257475048303604, -0.00314142694696784, 0.05456724017858505, 0.07224375009536743, 0.0941813513636589, 0.1230122447013855, -0.07287998497486115, -0.07873915135860443, 0.08034131675958633, -0.04709449037909508, -0.11787727475166321, -0.06861002743244171, -0.1452382206916809, -0.004202034790068865, -0.0335184708237648, 0.07922724634408951, 0.05637102574110031, 0.09526676684617996, -0.01433930266648531, -0.05777551606297493, 0.04157217592000961, -0.06985066086053848, 0.019995175302028656, -0.011708668433129787, -0.060763102024793625, 0.13121068477630615, 0.04688004031777382, -0.014275136403739452, -0.08673419058322906, 0.07951672375202179, -0.1204570084810257, -0.07978847622871399, -0.08239417523145676, -0.05098734796047211, -0.007686138618737459, -0.11281027644872665, 0.049203939735889435, -0.11458899080753326, -0.22705501317977905, -0.01313747651875019, 0.04550096020102501, -0.016462473198771477, -0.03707117214798927, -0.040630146861076355, -0.009473399259150028, 0.04880702868103981, -0.042894408106803894, 0.039052072912454605, -0.05325083062052727, 0.050018906593322754, -0.09618895500898361, 0.051321301609277725, -0.10470817238092422, 0.041251372545957565, -0.09671807289123535, 0.07595963031053543, 0.0017866486450657248, 0.0454767569899559, 0.010063555091619492, 0.08985432982444763, -0.03214199095964432, -0.044836416840553284, -0.07869677990674973, 0.026561295613646507, 
0.02284199558198452, 0.20134314894676208, -0.10196486860513687, -0.0819794163107872, 0.12614667415618896, -0.07252102345228195, -0.1288122832775116, 0.0409054271876812, -0.02831157296895981, 0.17958028614521027, 0.07370224595069885, 0.16096454858779907, 0.12039237469434738, -0.03798284754157066, 0.1263856589794159, 0.14821170270442963, -0.1269366294145584, -0.004153167363256216, 0.039416827261447906, 0.014042570255696774, -0.20559121668338776, 0.04052022844552994, -0.01946125738322735, 0.06584736704826355, -0.10201486945152283, -0.00946728140115738, 0.0031527038663625717, -0.02320384420454502, 0.0023026331327855587, -0.0472555048763752, 0.061078161001205444, 0.038737643510103226, -0.022763127461075783, 0.029682613909244537, 0.06635552644729614, -0.026160234585404396, -0.00823320634663105, -0.04484035447239876, 0.10009831935167313, -0.07260703295469284, 0.06797321885824203, -0.13160665333271027, -0.003086181590333581, -0.012450510635972023, 0.0972909927368164, 0.03811546042561531, 0.10621625185012817, 0.05470104143023491, 0.03126294165849686, 0.07701993733644485, -0.02846951223909855, 0.0746634304523468, 0.01002051867544651, -0.0850096121430397, -0.1272277683019638, 0.015217535197734833, -0.10649837553501129, -0.004191380459815264, -0.08523700386285782, -0.0015737697249278426, -0.12802091240882874, 0.05826076865196228, -0.016465239226818085, 0.0539434514939785, -0.055182818323373795, -0.04337453469634056, -0.044662054628133774, -0.022007398307323456, 0.0482511967420578, -0.033139705657958984, -0.06687536835670471, 0.16832175850868225, -0.15395553410053253, 0.2592274248600006, 0.1313537061214447, -0.0889907032251358, 0.002020547864958644, -0.07812081277370453, -0.04194976016879082, -0.012868959456682205, 0.07527336478233337, -0.03700125962495804, 0.15495356917381287, -0.03386903926730156, 0.17382068932056427, -0.099679134786129, -0.0334031768143177, 0.02144831046462059, -0.10981054604053497, 0.057656705379486084, 0.08051177859306335, 0.04427298903465271, -0.159133642911911, 0.08837400376796722, 0.19756358861923218, 0.05472150072455406, 0.20321963727474213, -0.006869805511087179, -0.06112205237150192, -0.05358194187283516, -0.0808696523308754, -0.052568696439266205, 0.056259434670209885, -0.09903126955032349, -0.004000484477728605, 0.06423190236091614, 0.08783505111932755, 0.11237549781799316, -0.10904275625944138, -0.046180129051208496, 0.05125856027007103, -0.004819708876311779, -0.051060013473033905, 0.07006146013736725, -0.0659489631652832, 0.13217470049858093, 0.014598124660551548, -0.07049204409122467, 0.0036125897895544767, -0.004913401324301958, -0.11891388893127441, 0.20653130114078522, -0.08047540485858917, -0.27306002378463745, -0.16792123019695282, -0.16288253664970398, 0.07165426760911942, 0.038431257009506226, 0.033738572150468826, -0.08776884526014328, -0.020982403308153152, 0.004409478977322578, 0.11553267389535904, -0.09698133170604706, 0.013121976517140865, 0.008159824647009373, -0.018650712445378304, -0.07579360157251358, -0.09033482521772385, -0.0241270512342453, -0.02461584471166134, 0.020020704716444016, 0.03998296707868576, -0.11154978722333908, 0.06758414953947067, 0.2167699933052063, -0.015538511797785759, 0.06870997697114944, 0.00025148785789497197, 0.26176807284355164, -0.08426473289728165, 0.040830448269844055, 0.11926601082086563, -0.08760137856006622, 0.05199241638183594, 0.07132956385612488, 0.03210015222430229, -0.014074578881263733, 0.016441889107227325, -0.11233895272016525, -0.12864868342876434, -0.1923626959323883, -0.06961654871702194, 
-0.028241310268640518, 0.13464264571666718, 0.031150488182902336, 0.04321796074509621, 0.10346641391515732, 0.07471037656068802, 0.06701335310935974, 0.03259968012571335, -0.0005120337591506541, 0.0647427961230278, 0.024594781920313835, -0.05812343955039978, 0.054217349737882614, -0.04845457896590233, -0.0797470211982727, 0.08279551565647125, -0.011098933406174183, 0.0927528515458107, 0.06928195804357529, 0.02340286411345005, 0.018686039373278618, 0.04218229651451111, 0.15593960881233215, 0.22442668676376343, -0.012412761338055134, -0.041085485368967056, -0.05078154057264328, -0.040494389832019806, -0.01600850187242031, 0.015044075436890125, -0.05785144492983818, -0.033252447843551636, -0.0728597640991211, -0.015066487714648247, 0.011195010505616665, 0.015441779047250748, 0.07578693330287933, -0.22024130821228027, -0.038240667432546616, 0.042616840451955795, -0.013794191181659698, -0.10639895498752594, 0.05872863903641701, 0.016779562458395958, -0.17391349375247955, -0.07854076474905014, -0.016605399549007416, 0.1603294163942337, -0.030760308727622032, 0.0619782954454422, 0.005449770484119654, 0.02271227352321148, -0.013140208087861538, 0.11191333085298538, -0.27346712350845337, 0.1954270750284195, 0.001131516881287098, -0.04876048117876053, -0.016439033672213554, -0.04243995249271393, 0.0009058643481694162, 0.14556926488876343, 0.09718295931816101, 0.0028763783629983664, 0.0669604167342186, -0.07678256928920746, -0.11943262070417404, 0.05284353718161583, 0.08068333566188812, -0.07738065719604492, 0.029960619285702705, -0.029798466712236404, 0.027152907103300095, -0.007555682212114334, -0.030231619253754616, 0.002119861776009202, -0.11661309748888016, 0.02936525270342827, -0.08075195550918579, 0.06012337654829025, 0.02433968149125576, -0.02529163844883442, -0.012048180215060711, 0.1316436529159546, -0.013300766237080097, -0.08264251798391342, -0.08976204693317413, -0.02328740619122982, 0.09523095935583115, -0.05599937587976456, 0.03358715400099754, -0.08175740391016006, -0.04073614999651909, 0.005860272329300642, -0.16970814764499664, 0.06983034312725067, -0.10846570879220963, -0.09971687942743301, -0.050264790654182434, 0.15346404910087585, 0.013677009381353855, 0.025709833949804306, 0.03220117464661598, -0.04211581498384476, -0.18150363862514496, -0.15989434719085693, -0.007562890648841858, 0.0717545747756958, -0.04433317109942436, 0.03638565540313721, 0.007171243894845247, 0.10013602674007416, 0.004198792390525341, 0.07230839878320694, 0.2026015669107437, 0.16423118114471436, -0.08760133385658264, 0.17723721265792847, 0.16266676783561707, -0.12243213504552841, -0.2722402811050415, -0.09522651135921478, -0.05925937369465828, 0.03468820080161095, 0.02297091670334339, -0.13072867691516876, 0.06184706464409828, -0.011241482570767403, -0.004976592492312193, 0.13391432166099548, -0.2790721356868744, -0.07025358080863953, 0.13864430785179138, -0.012145180255174637, 0.2560276985168457, -0.042459286749362946, -0.08155408501625061, -0.060940731316804886, -0.2339130938053131, 0.1595010906457901, -0.12908293306827545, 0.030256805941462517, -0.06380902975797653, 0.1317017376422882, 0.04475972056388855, -0.051817599684000015, 0.13714583218097687, -0.0770399421453476, 0.03692200407385826, -0.1231972947716713, -0.01437266543507576, 0.05212629213929176, -0.014681367203593254, 0.10554680228233337, -0.053141020238399506, 0.10400939732789993, -0.12106935679912567, -0.052672889083623886, -0.054288461804389954, 0.017598338425159454, -0.023758167400956154, -0.05668776109814644, -0.039483629167079926, 
-0.05230721831321716, 0.00942184031009674, -0.024894973263144493, -0.008981208316981792, -0.02189256064593792, 0.08200293034315109, 0.10853444039821625, 0.1416669338941574, -0.04508063197135925, -0.02666328102350235, -0.029412275180220604, -0.043095141649246216, 0.07755832374095917, -0.1675589680671692, -0.020979177206754684, 0.15767353773117065, 0.008264025673270226, 0.08081416040658951, 0.07994852215051651, -0.043529048562049866, -0.04116993397474289, 0.09435915946960449, -0.23738352954387665, -0.032961416989564896, -0.07289689034223557, -0.032304681837558746, 0.05143286660313606, 0.06389017403125763, 0.11233682930469513, -0.055076416581869125, -0.015500548295676708, 0.038369257003068924, -0.013473432511091232, -0.10457789897918701, 0.12659704685211182, 0.07594829052686691, 0.04931824654340744, -0.13000807166099548, 0.03979043290019035, -0.02080575004220009, -0.024042857810854912, -0.009190280921757221, 0.09610513597726822, -0.13868926465511322, -0.061987441033124924, 0.01100219041109085, 0.1624082624912262, -0.08940329402685165, -0.054934311658144, -0.00678250240162015, -0.07782098650932312, 0.06215988099575043, 0.06269455701112747, 0.039047662168741226, 0.10006190836429596, -0.08492296934127808, -0.004345493856817484, -0.04427671059966087, 0.02742549031972885, 0.04004936292767525, -0.01839151792228222, -0.11644710600376129, 0.050648268312215805, 0.01261399406939745, 0.21786263585090637, -0.12195943295955658, -0.07748695462942123, -0.13975368440151215, 0.03579137846827507, -0.1441981941461563, -0.02782432734966278, -0.09455464035272598, -0.0542730838060379, -0.024786408990621567, -0.02354593575000763, -0.05044161155819893, -0.03595460206270218, -0.06568260490894318, 0.04963921010494232, -0.01889806240797043, -0.04201965406537056, -0.018809955567121506, 0.04780932888388634, 0.10624072700738907, -0.0022816911805421114, 0.11582330614328384, 0.10476028919219971, -0.06149300932884216, 0.06964143365621567, -0.08975338935852051, 0.049342647194862366, 0.010800108313560486, -0.03639211133122444, 0.07890737056732178, 0.033158838748931885, 0.011678727343678474, -0.02014644630253315, -0.05248590186238289, 0.015699470415711403, 0.019494805485010147, -0.09001129865646362, 0.04338252544403076, 0.03427375108003616, -0.07128193974494934, -0.06945458799600601, -0.02831537090241909, -0.04915383830666542, 0.10966338962316513, 0.09227382391691208, 0.01580313965678215, 0.11524862796068192, -0.09982031583786011, -0.0043287696316838264, 0.0288130734115839, -0.08074736595153809, -0.01706261746585369, -0.10044533759355545, -0.01304725930094719, -0.02274717018008232, 0.2554529011249542, 0.12089171260595322, -0.025309232994914055, -0.03230812028050423, 0.07114472985267639, 0.08105676621198654, -0.0211230106651783, 0.14824873208999634, 0.03444083034992218, -0.0007331980159506202, -0.1400776505470276, 0.10673409700393677, -0.060156434774398804, -0.010151425376534462, 0.09550673514604568, -0.08319920301437378, 0.048856139183044434, 0.07468824833631516, -0.01950058713555336, 0.05372466519474983, -0.11716536432504654, -0.2690386474132538, 0.023945249617099762, 0.027653370052576065, -0.0441947840154171, 0.07253700494766235, 0.145015150308609, 0.00042942073196172714, 0.05244648456573486, 0.061493102461099625, -0.05709811672568321, -0.17804701626300812, -0.19115881621837616, -0.0384756401181221, -0.11082857847213745, -0.023826930671930313, -0.10639674216508865, 0.04148538038134575, -0.02072913944721222, 0.05925795063376427, -0.09639845043420792, 0.12447383254766464, 0.06843417137861252, -0.11577396094799042, 
0.05810433253645897, -0.008805959485471249, 0.048459235578775406, -0.07387776672840118, 0.08210063725709915, -0.10721065104007721, -0.026499031111598015, -0.016933415085077286, 0.03711435943841934, -0.05858420953154564, 0.0011270071845501661, -0.10357651859521866, -0.06808403134346008, -0.056935109198093414, 0.09072309732437134, -0.024477418512105942, 0.03998230770230293, -0.014557241462171078, -0.061277762055397034, -0.025446701794862747, 0.2273169606924057, -0.018587565049529076, -0.043939489871263504, -0.0661960318684578, 0.2851298749446869, -0.06544138491153717, 0.07253559678792953, -0.032977886497974396, -0.001274158013984561, -0.07127615064382553, 0.2931469976902008, 0.36314713954925537, -0.14264726638793945, 0.011796033009886742, -0.018389053642749786, 0.03556118905544281, 0.07535336911678314, 0.18024654686450958, 0.07291083037853241, 0.3107033371925354, -0.04080776497721672, -0.01225926261395216, -0.10546047985553741, -0.03835856914520264, 0.014304363168776035, 0.02947218343615532, 0.08378855139017105, -0.05586446449160576, -0.06808875501155853, 0.1039084792137146, -0.26703301072120667, -0.02056516334414482, -0.16380304098129272, -0.061613935977220535, -0.04166705906391144, 0.0007227687747217715, 0.07237391918897629, 0.028740311041474342, 0.05115301162004471, -0.039005450904369354, -0.047156207263469696, 0.057444483041763306, -0.02154913917183876, -0.12674635648727417, 0.0002557095722295344, 0.143532857298851, -0.07906237244606018, -0.0018181405030190945, 0.0032308290246874094, 0.060348983854055405, 0.044118594378232956, 0.04119637981057167, -0.10164451599121094, 0.02608482725918293, 0.01246592216193676, -0.03363148868083954, -0.028164468705654144, 0.008156497962772846, 0.07835527509450912, -0.21697945892810822, 0.0020338469184935093, -0.14078554511070251, 0.011757226660847664, -0.07641053944826126, -0.006896127946674824, -0.08222074061632156, 0.03242125362157822, 0.004625517874956131, 0.1118803396821022, 0.11125602573156357, -0.03202005848288536, -0.0006144302315078676, -0.06265610456466675, 0.06727221608161926, -0.06884542852640152, -0.02960195019841194, -0.025150567293167114, -0.09257599711418152, -0.09335606545209885, 0.09815482050180435, -0.022339481860399246, -0.1427105814218521, 0.007601875811815262, -0.09401176869869232, -0.04369132220745087, -0.021486658602952957, 0.09382037818431854, 0.11086808145046234, 0.09180203825235367, -0.007599277421832085, 0.047748953104019165, 0.03120456263422966, 0.07436691224575043, -0.12886843085289001, -0.10148585587739944 ]
null
null
transformers
<div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1325460517922729984/xDO9dBt-_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Jay Alammar</div> <div style="text-align: center; font-size: 14px;">@jayalammar</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Jay Alammar. | Data | Jay Alammar | | --- | --- | | Tweets downloaded | 692 | | Retweets | 198 | | Short tweets | 35 | | Tweets kept | 459 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1wf3zug3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jayalammar's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/hq8g8xlh) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/hq8g8xlh/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jayalammar') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "http://www.huggingtweets.com/jayalammar/1638460288971/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jayalammar
[ "transformers", "pytorch", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
AI BOT Jay Alammar @jayalammar I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on tweets from Jay Alammar. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jayalammar's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
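As in the card above, the model can be used with a text-generation pipeline; the sketch below mirrors that snippet and additionally fixes the random seed with the standard transformers helper (the seed value is illustrative, not from the card):

```python
from transformers import pipeline, set_seed

set_seed(42)  # illustrative seed so repeated runs produce the same samples

# Text-generation pipeline around the fine-tuned GPT-2 checkpoint
generator = pipeline('text-generation', model='huggingtweets/jayalammar')
generator("My dream is", num_return_sequences=5)
```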
[]
[ "TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 54 ]
[ "passage: TAGS\n#transformers #pytorch #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ -0.0001764487533364445, -0.01891571842133999, -0.0068881697952747345, 0.01242890115827322, 0.16224369406700134, 0.04406825825572014, 0.08452208340167999, 0.14250440895557404, -0.026455026119947433, -0.016114573925733566, 0.17334569990634918, 0.17106501758098602, -0.014037161134183407, 0.08718273043632507, -0.05552244931459427, -0.2646014094352722, 0.044212065637111664, 0.058431971818208694, -0.020032864063978195, 0.14111687242984772, 0.0714879110455513, -0.01828647032380104, 0.10845158249139786, -0.02953636273741722, -0.18877948820590973, 0.03499612212181091, 0.05605728179216385, -0.09992336481809616, 0.11936124414205551, 0.04713086411356926, 0.08659981191158295, 0.015409729443490505, -0.07482189685106277, -0.12249794602394104, 0.03866785019636154, 0.041891295462846756, -0.0629689022898674, 0.05911329388618469, 0.08818957209587097, -0.11120674759149551, 0.1456013321876526, 0.07799911499023438, -0.01863223686814308, 0.07941857725381851, -0.17164963483810425, -0.019008882343769073, -0.036983806639909744, 0.005464354529976845, 0.057414017617702484, 0.07476010918617249, -0.01932354085147381, 0.1732589453458786, -0.0767555758357048, 0.09587065875530243, 0.16117197275161743, -0.2913956344127655, -0.005072304047644138, 0.0498935841023922, 0.06722559779882431, 0.03902119770646095, -0.015738610178232193, 0.08706029504537582, 0.06277379393577576, 0.02536560781300068, -0.0014978112885728478, -0.06339331716299057, -0.0928240567445755, 0.04014677554368973, -0.0745372325181961, -0.06578013300895691, 0.20811960101127625, -0.039430033415555954, 0.050536420196294785, -0.03807967156171799, -0.10491003096103668, -0.02197154052555561, -0.015753688290715218, 0.00720712635666132, -0.06031509116292, 0.08949250727891922, -0.014659170061349869, -0.07363511621952057, -0.15025688707828522, -0.016003666445612907, -0.18369624018669128, 0.1574522852897644, 0.003967532888054848, 0.04864242300391197, -0.2093716263771057, 0.11408322304487228, 0.020872395485639572, -0.08021171391010284, 0.047296058386564255, -0.09546594321727753, 0.07179957628250122, 0.002116252202540636, -0.05267130210995674, -0.02264278009533882, 0.08682441711425781, 0.15258505940437317, -0.026140356436371803, 0.0017382961232215166, -0.027155736461281776, 0.07059825956821442, 0.05281682312488556, 0.040018972009420395, -0.017763059586286545, -0.04289618879556656, 0.045782025903463364, -0.15945611894130707, -0.007582054473459721, -0.06918198615312576, -0.1068587526679039, -0.05112413689494133, 0.022331949323415756, 0.06442617624998093, 0.031878020614385605, 0.11345649510622025, -0.04609488695859909, -0.014168480411171913, 0.06350603699684143, -0.042460083961486816, -0.0172110628336668, -0.016047241166234016, 0.015319211408495903, 0.14095398783683777, -0.018181998282670975, 0.030711950734257698, -0.11251416057348251, 0.0645761713385582, -0.09946728497743607, -0.01914660818874836, -0.0071241967380046844, -0.04078202322125435, 0.029947424307465553, -0.13505136966705322, 0.010975461453199387, -0.1711256355047226, -0.14874492585659027, 0.009897243231534958, -0.02876151353120804, -0.018498392775654793, -0.059200938791036606, -0.03973992541432381, -0.01719699800014496, 0.06084148958325386, -0.04528016224503517, 0.0009662628290243447, -0.05951235070824623, 0.11524756252765656, -0.05245914310216904, 0.06888316571712494, -0.13250301778316498, 0.05976588651537895, -0.15375985205173492, -0.00870948750525713, -0.04503806680440903, 0.08287355303764343, 0.017666861414909363, 0.16653785109519958, -0.006960130762308836, -0.012972959317266941, -0.09847598522901535, 
0.06441020965576172, -0.023426895961165428, 0.24133971333503723, -0.06349262595176697, -0.143473818898201, 0.22233937680721283, -0.06944137066602707, -0.1420930027961731, 0.12657590210437775, 0.020838137716054916, 0.07354387640953064, 0.10204131156206131, 0.19502143561840057, 0.014321080408990383, 0.006638950202614069, 0.054760824888944626, 0.0820525735616684, -0.17468082904815674, -0.03090454451739788, -0.00916894432157278, -0.01691337674856186, -0.1358218789100647, 0.043003637343645096, 0.11889315396547318, 0.10293904691934586, -0.07099705934524536, -0.013554582372307777, -0.03317642584443092, -0.004439891315996647, 0.06940841674804688, -0.007896981202065945, 0.09806080162525177, -0.09929461032152176, -0.040301576256752014, -0.058301206678152084, -0.006890235934406519, 0.0024084753822535276, 0.04120192304253578, -0.040066637098789215, 0.10096675902605057, -0.0006905568297952414, 0.05225489288568497, -0.14046427607536316, -0.07798092812299728, -0.020940499380230904, 0.1575685739517212, 0.04120592027902603, 0.04776391014456749, 0.057414863258600235, -0.0481991246342659, -0.0158701092004776, -0.009968787431716919, 0.16312187910079956, -0.0394844189286232, -0.06968989968299866, -0.056453973054885864, 0.10656707733869553, -0.058365050703287125, 0.03222460299730301, -0.04202231392264366, 0.022449776530265808, 0.060929615050554276, 0.12114907056093216, -0.0026538248639553785, 0.029062092304229736, -0.01023187953978777, -0.0038288652431219816, -0.07459788024425507, -0.020947640761733055, 0.10031349956989288, -0.004600553773343563, -0.08327498286962509, 0.23685328662395477, -0.17454755306243896, 0.19663569331169128, 0.2115958034992218, -0.2628093659877777, -0.024321777746081352, -0.06840608268976212, -0.05017746612429619, 0.003011465771123767, 0.05876095965504646, -0.04692309722304344, 0.09809201210737228, -0.02521132305264473, 0.16497591137886047, -0.04652651026844978, -0.07362692058086395, 0.016456644982099533, -0.05898859724402428, -0.0463954322040081, 0.0659271627664566, 0.08106416463851929, -0.15480613708496094, 0.18694530427455902, 0.20838376879692078, 0.07612863928079605, 0.19334357976913452, 0.004058185499161482, -0.014812501147389412, 0.08005014806985855, -0.03805047646164894, -0.04202093929052353, -0.07553261518478394, -0.16944189369678497, -0.01902174763381481, 0.07485251128673553, 0.03750864416360855, 0.11274250596761703, -0.10172852873802185, -0.07372885197401047, -0.016179129481315613, -0.005032413639128208, 0.005167648661881685, 0.1174640879034996, 0.045775800943374634, 0.14043675363063812, -0.019972821697592735, 0.03493902459740639, 0.08747350424528122, 0.02448674477636814, -0.10759711265563965, 0.16407065093517303, -0.14081640541553497, -0.38538745045661926, -0.16212545335292816, -0.13394121825695038, -0.029274288564920425, 0.04825805127620697, 0.11038947850465775, -0.13598975539207458, 0.0011978530092164874, -0.003706524148583412, 0.12342415004968643, -0.0806080624461174, 0.03755999356508255, -0.07838296890258789, 0.026997538283467293, -0.06349453330039978, -0.07917723804712296, -0.036463622003793716, -0.03232228383421898, -0.10000553727149963, 0.1757805496454239, -0.11054177582263947, 0.057571277022361755, 0.1741490364074707, 0.022440658882260323, 0.034390855580568314, -0.0513761006295681, 0.17275545001029968, -0.11779367178678513, 0.02093288116157055, 0.16278521716594696, -0.01799617148935795, 0.08678310364484787, 0.08059167861938477, -0.015366556122899055, -0.10777527838945389, 0.05196633189916611, 0.0019955262541770935, -0.1096891239285469, -0.20052047073841095, 
-0.12150565534830093, -0.0784008651971817, 0.14483654499053955, 0.05303339660167694, 0.05915789678692818, 0.17167222499847412, 0.08591149002313614, -0.04288473725318909, -0.004711467772722244, -0.012867298908531666, 0.07781979441642761, 0.1684085726737976, -0.017248503863811493, 0.11789125204086304, -0.05446818470954895, -0.11601924896240234, 0.13826869428157806, 0.02504623495042324, 0.050291191786527634, 0.04182872176170349, 0.008374262601137161, -0.009610554203391075, 0.09969738125801086, 0.12988939881324768, 0.118865467607975, -0.008107239380478859, -0.0232877004891634, -0.03601100295782089, -0.00860752072185278, -0.03570752218365669, 0.034572016447782516, 0.011757065542042255, -0.16013272106647491, -0.05726486071944237, -0.12173470109701157, 0.0964120477437973, 0.09787409752607346, 0.08039643615484238, -0.2033146470785141, -0.004589703865349293, 0.07378163933753967, -0.03603411093354225, -0.11624246090650558, 0.086527980864048, 0.033109065145254135, -0.1271866261959076, 0.0817195475101471, -0.03352120518684387, 0.115711510181427, -0.017423994839191437, 0.09427224844694138, -0.04346824064850807, -0.0329415462911129, -0.012381686829030514, 0.10430185496807098, -0.30799204111099243, 0.17485815286636353, -0.019660785794258118, -0.07034741342067719, -0.07672256976366043, -0.025566134601831436, 0.017929747700691223, 0.07530300319194794, 0.09619415551424026, 0.024311896413564682, 0.04642496258020401, -0.09243860840797424, -0.03940937668085098, 0.034113768488168716, 0.13641610741615295, -0.0638844221830368, -0.015862328931689262, -0.04075292870402336, 0.01116214320063591, -0.019626103341579437, -0.027355113998055458, 0.018256209790706635, -0.1504947543144226, 0.05358212813735008, 0.017237937077879906, 0.0753018707036972, 0.03889141231775284, -0.007973386906087399, -0.10224062204360962, 0.18268363177776337, -0.03004412353038788, -0.08521177619695663, -0.127006396651268, -0.04812724515795708, 0.04587927460670471, -0.051474425941705704, 0.034381575882434845, -0.06691597402095795, -0.011345877312123775, -0.06886660307645798, -0.21483656764030457, 0.12495172768831253, -0.0775398537516594, -0.07874035835266113, -0.03474915772676468, 0.20981398224830627, -0.05076101794838905, -0.00018431349599268287, 0.01172169204801321, 0.014822970144450665, -0.1086968258023262, -0.10466268658638, 0.06874550879001617, -0.034708425402641296, 0.02743770368397236, 0.02760813757777214, -0.03700246661901474, 0.02073092758655548, -0.06074898689985275, -0.01314478274434805, 0.2849438786506653, 0.22848764061927795, -0.035804633051157, 0.1875685602426529, 0.10711772739887238, -0.07248730212450027, -0.30828598141670227, -0.0999293103814125, -0.13133330643177032, -0.033457282930612564, -0.02052777260541916, -0.17081500589847565, 0.07089676707983017, 0.04558560997247696, 0.00998434517532587, 0.14900504052639008, -0.21140912175178528, -0.08518896996974945, 0.1092359870672226, -0.03466132655739784, 0.42140644788742065, -0.1164151057600975, -0.09652433544397354, -0.05340435355901718, -0.15239587426185608, 0.20086675882339478, -0.01557826716452837, 0.08761543780565262, -0.031178412958979607, 0.14360429346561432, 0.04995222017168999, -0.01680157333612442, 0.08329997956752777, 0.0014065488940104842, 0.004026432521641254, -0.12653036415576935, -0.022627411410212517, 0.0504734143614769, 0.021120509132742882, 0.0054380460642278194, -0.07736434042453766, 0.028706049546599388, -0.14863882958889008, -0.024921666830778122, -0.10909338295459747, 0.08278775960206985, 0.03857516124844551, -0.07422378659248352, -0.010892878286540508, 
-0.05615931376814842, -0.023589344695210457, -0.012528443709015846, 0.13453009724617004, -0.050522636622190475, 0.1720355898141861, 0.036824267357587814, 0.11522943526506424, -0.13816054165363312, 0.06146138161420822, -0.07406572997570038, -0.07532623410224915, 0.06773588061332703, -0.13577620685100555, 0.05240122601389885, 0.10075365751981735, -0.03372569754719734, 0.04758675396442413, 0.08927652984857559, 0.000919255951885134, 0.009403540752828121, 0.15953567624092102, -0.2761637568473816, 0.01791755110025406, -0.07046908140182495, -0.07692829519510269, 0.112159863114357, 0.07484955340623856, 0.181244894862175, 0.02791808731853962, -0.0472177192568779, 0.012221097014844418, 0.019269011914730072, -0.05120278522372246, 0.0548672117292881, 0.006578541360795498, -0.011816216632723808, -0.14148055016994476, 0.08790901303291321, -0.0015104453777894378, -0.1429632604122162, 0.02210722118616104, 0.19665491580963135, -0.13288317620754242, -0.10024755448102951, -0.05081937462091446, 0.057288672775030136, -0.13861924409866333, 0.008680099621415138, -0.01990172080695629, -0.09562872350215912, 0.0756591260433197, 0.1573420912027359, 0.05079081282019615, 0.12963363528251648, -0.02916029281914234, -0.008411908522248268, -0.04279141128063202, -0.051675889641046524, 0.02788730151951313, 0.019720058888196945, -0.07752392441034317, 0.08735781908035278, -0.024015765637159348, 0.14329436421394348, -0.10051412135362625, -0.06987810134887695, -0.1344141960144043, -0.005019306670874357, -0.09607285261154175, -0.0959741547703743, -0.08079583197832108, -0.061935000121593475, 0.004180733114480972, -0.039079975336790085, -0.0417400486767292, -0.08063078671693802, -0.10278100520372391, 0.016344387084245682, -0.02737213484942913, 0.028547246009111404, -0.07021904736757278, 0.008450011722743511, 0.12212047725915909, -0.029432062059640884, 0.17478644847869873, 0.15045183897018433, -0.10443241149187088, 0.10585668683052063, -0.17528948187828064, -0.10057486593723297, 0.10386897623538971, -0.01424362976104021, 0.02509693056344986, 0.12928596138954163, 0.018586233258247375, 0.04242280498147011, 0.03278495371341705, 0.06450606882572174, 0.04438134282827377, -0.11879310011863708, 0.08265755325555801, -0.002198860514909029, -0.15595629811286926, -0.061124756932258606, -0.09235648810863495, 0.02707000821828842, 0.02035105973482132, 0.11072126775979996, -0.04177020862698555, 0.08901136368513107, -0.06589431315660477, 0.023006385192275047, 0.025188656523823738, -0.17641755938529968, -0.03692740947008133, -0.05073147267103195, 0.03235582634806633, 0.028527792543172836, 0.22216679155826569, 0.015049049630761147, -0.030550595372915268, 0.0387740358710289, 0.12439186871051788, 0.015807392075657845, -0.0005460345419123769, 0.1709187924861908, 0.10334094613790512, -0.07364179939031601, -0.14011329412460327, 0.0676020160317421, 0.012262817472219467, -0.05599943920969963, 0.11458845436573029, -0.01812131516635418, -0.006866521667689085, 0.0670672059059143, -0.019556820392608643, 0.03449620306491852, -0.06607513129711151, -0.12762659788131714, -0.02478908933699131, 0.04269528388977051, 0.0029638311825692654, 0.12794137001037598, 0.15664103627204895, -0.005168106406927109, 0.026651626452803612, -0.015845872461795807, -0.023246217519044876, -0.13421551883220673, -0.15646639466285706, -0.06222613900899887, -0.14998294413089752, 0.022462334483861923, -0.08663632720708847, 0.047867026180028915, 0.05302877724170685, 0.07266844809055328, -0.06541765481233597, 0.06867647916078568, 0.049353908747434616, -0.11939282715320587, 0.08969518542289734, 
-0.024748487398028374, 0.04455447196960449, -0.0018985075876116753, -0.03077637031674385, -0.10359742492437363, 0.042003192007541656, -0.01649407483637333, 0.045346152037382126, -0.04421991854906082, 0.022579338401556015, -0.1750071942806244, -0.10995419323444366, -0.04818427562713623, 0.06838665902614594, -0.06683412194252014, 0.04056481271982193, 0.01606888137757778, 0.011690732091665268, 0.030521972104907036, 0.22083373367786407, -0.041100796312093735, -0.04384056478738785, -0.041319385170936584, 0.1635150909423828, -0.014866671524941921, 0.08733032643795013, -0.027159888297319412, -0.008534550666809082, -0.09159335494041443, 0.3551705777645111, 0.29626867175102234, -0.08988689631223679, 0.018322573974728584, -0.023403000086545944, 0.04054094851016998, 0.13749836385250092, 0.13930056989192963, 0.09693139791488647, 0.23870247602462769, -0.069560706615448, -0.05235563591122627, -0.01810343936085701, -0.013992605730891228, -0.06698887050151825, 0.09740167111158371, 0.02308662235736847, -0.05992421507835388, -0.0399320125579834, 0.09157036244869232, -0.2363831102848053, 0.1119081899523735, -0.1049022525548935, -0.15424089133739471, -0.03413557633757591, 0.011207441799342632, 0.07758733630180359, 0.01318280678242445, 0.11218693852424622, 0.013466300442814827, -0.0844072625041008, 0.014990749768912792, 0.034041877835989, -0.25270384550094604, 0.004782035481184721, 0.053680628538131714, -0.12391778826713562, -0.004765757359564304, -0.025234686210751534, 0.013625700026750565, 0.05700398609042168, 0.04094824194908142, -0.03069460764527321, 0.016096249222755432, -0.006652043666690588, -0.020144162699580193, -0.008288858458399773, 0.05953062325716019, 0.04713095352053642, -0.1559005230665207, 0.06380777060985565, -0.13577982783317566, 0.04075455665588379, -0.023885579779744148, -0.012463473714888096, -0.0012933483812958002, 0.01776987873017788, -0.0522073432803154, 0.06581555306911469, 0.07255802303552628, -0.0144086554646492, 0.008549283258616924, -0.08346639573574066, -0.034440234303474426, -0.023841137066483498, -0.10951124131679535, -0.08404874801635742, -0.13158579170703888, -0.12121031433343887, 0.10822955518960953, -0.02819518744945526, -0.18475614488124847, 0.03248962014913559, -0.12301548570394516, 0.05860345810651779, -0.17328192293643951, 0.11332228779792786, 0.07014153152704239, 0.018578365445137024, 0.01264720968902111, -0.007008096668869257, 0.08073210716247559, 0.11507359892129898, -0.0813264548778534, -0.08199906349182129 ]
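Each record in this dump carries, among other fields, an `id`, a `pipeline_tag`, a `tags` list, a `tokens_length` entry, and a long fixed-length `embeddings` vector like the one that closes the record above. As a minimal sketch of how such a dump could be inspected (not an official loader; the file name `model_cards.parquet` is a placeholder, not the dataset's actual location):

```python
# Minimal sketch, assuming the records shown in this dump are stored in a
# Parquet file with columns such as id, pipeline_tag, tags, processed_texts,
# tokens_length, and embeddings. The path below is hypothetical.
import pandas as pd

df = pd.read_parquet("model_cards.parquet")  # hypothetical path

row = df[df["id"] == "huggingtweets/jazzpomegranate"].iloc[0]
print(row["pipeline_tag"])     # "text-generation"
print(list(row["tags"])[:4])   # ["transformers", "pytorch", "jax", "gpt2"]
print(row["tokens_length"])    # token count(s) of the "passage:" input text, e.g. [57]
print(len(row["embeddings"]))  # embedding dimensionality (768 per record in this dump)
```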
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/1345079087337938944/tUHfuOi2_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jasmine Persephone ☭ Black Podcast Revolution 🤖 AI Bot </div> <div style="font-size: 15px">@jazzpomegranate bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://app.wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-model-to-generate-tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jazzpomegranate's tweets](https://twitter.com/jazzpomegranate). | Data | Quantity | | --- | --- | | Tweets downloaded | 3208 | | Retweets | 184 | | Short tweets | 720 | | Tweets kept | 2304 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/312m9owm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jazzpomegranate's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jvni6p8a) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jvni6p8a/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jazzpomegranate') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jazzpomegranate/1614106581220/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jazzpomegranate
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jasmine Persephone Black Podcast Revolution AI Bot @jazzpomegranate bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jazzpomegranate's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jazzpomegranate's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
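The `embeddings` arrays attached to the @jazzpomegranate and @jbmurray records appear to be value-for-value identical, which is consistent with the vectors encoding the shared `passage: TAGS #transformers #pytorch ...` input text rather than the per-user model cards. A small, illustrative sketch (nothing below is part of the dataset itself) shows how to check that numerically:

```python
# Illustrative only: compare two records' embedding vectors with cosine
# similarity. Records that share the same "passage:" input text are expected
# to carry identical vectors (cosine similarity ~1.0), as the arrays above
# suggest. The Parquet path is the same hypothetical placeholder as before.
import numpy as np
import pandas as pd

df = pd.read_parquet("model_cards.parquet")  # hypothetical path

def cosine(a, b):
    a = np.asarray(a, dtype=np.float32)
    b = np.asarray(b, dtype=np.float32)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

emb_a = df.loc[df["id"] == "huggingtweets/jazzpomegranate", "embeddings"].iloc[0]
emb_b = df.loc[df["id"] == "huggingtweets/jbmurray", "embeddings"].iloc[0]
print(f"cosine similarity: {cosine(emb_a, emb_b):.4f}")  # ~1.0 for identical vectors
```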
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/63826202/jon_buffalo_400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jon Beasley-Murray 🤖 AI Bot </div> <div style="font-size: 15px">@jbmurray bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jbmurray's tweets](https://twitter.com/jbmurray). | Data | Quantity | | --- | --- | | Tweets downloaded | 2861 | | Retweets | 364 | | Short tweets | 260 | | Tweets kept | 2237 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1yppksvx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jbmurray's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/dqw1zvsq) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/dqw1zvsq/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jbmurray') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jbmurray/1617246417542/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jbmurray
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jon Beasley-Murray AI Bot @jbmurray bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jbmurray's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jbmurray's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]
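The `[ 57 ]` entries stored with each record look like token counts for the corresponding `passage: TAGS ...` input text. The sketch below shows how such a count could be reproduced; it assumes the counts come from the GPT-2 tokenizer these models are fine-tuned from, which the dump itself does not state, so the exact number is something to verify rather than a given.

```python
# Hedged sketch: re-tokenize the "passage:" string and compare against the
# stored tokens_length value ([57]). The choice of GPT-2's tokenizer is an
# assumption; the dump does not name the tokenizer actually used.
from transformers import GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
passage = (
    "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation "
    "#huggingtweets #en #autotrain_compatible #endpoints_compatible "
    "#text-generation-inference #region-us \n"
)
print(len(tokenizer(passage)["input_ids"]))  # compare with the stored value, 57
```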
null
null
transformers
<div> <div style="width: 132px; height:132px; border-radius: 50%; background-size: cover; background-image: url('https://pbs.twimg.com/profile_images/948678870990954496/5moZ7K0__400x400.jpg')"> </div> <div style="margin-top: 8px; font-size: 19px; font-weight: 800">Jordan Peterson Quotes 🤖 AI Bot </div> <div style="font-size: 15px">@jbpetersonquote bot</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on [@jbpetersonquote's tweets](https://twitter.com/jbpetersonquote). | Data | Quantity | | --- | --- | | Tweets downloaded | 1983 | | Retweets | 605 | | Short tweets | 47 | | Tweets kept | 1331 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1n1ihdfe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jbpetersonquote's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1qijh16v) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1qijh16v/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/jbpetersonquote') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
{"language": "en", "tags": ["huggingtweets"], "thumbnail": "https://www.huggingtweets.com/jbpetersonquote/1620104584619/predictions.png", "widget": [{"text": "My dream is"}]}
text-generation
huggingtweets/jbpetersonquote
[ "transformers", "pytorch", "jax", "gpt2", "text-generation", "huggingtweets", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
2022-03-02T23:29:05+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Jordan Peterson Quotes AI Bot @jbpetersonquote bot I was made with huggingtweets. Create your own bot based on your favorite user with the demo! How does it work? ----------------- The model uses the following pipeline. !pipeline To understand how the model was developed, check the W&B report. Training data ------------- The model was trained on @jbpetersonquote's tweets. Explore the data, which is tracked with W&B artifacts at every step of the pipeline. Training procedure ------------------ The model is based on a pre-trained GPT-2 which is fine-tuned on @jbpetersonquote's tweets. Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility. At the end of training, the final model is logged and versioned. How to use ---------- You can use this model directly with a pipeline for text generation: Limitations and bias -------------------- The model suffers from the same limitations and bias as GPT-2. In addition, the data present in the user's tweets further affects the text generated by the model. About ----- *Built by Boris Dayma* ![Follow](URL For more details, visit the project repository. ![GitHub stars](URL
[]
[ "TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 57 ]
[ "passage: TAGS\n#transformers #pytorch #jax #gpt2 #text-generation #huggingtweets #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
[ 0.004547144751995802, -0.006708405911922455, -0.007013476919382811, 0.01947171241044998, 0.15818242728710175, 0.03448796644806862, 0.08709780126810074, 0.15389476716518402, -0.019877297803759575, -0.022431448101997375, 0.18047170341014862, 0.173692986369133, -0.012988686561584473, 0.09047263860702515, -0.05271327868103981, -0.2622397541999817, 0.03682216629385948, 0.05513067543506622, -0.007422737777233124, 0.14252057671546936, 0.07580838352441788, -0.023790201172232628, 0.11380083113908768, -0.02966974675655365, -0.202972412109375, 0.03197307139635086, 0.0615268237888813, -0.09518525749444962, 0.11083168536424637, 0.04628797993063927, 0.08698221296072006, 0.022143812850117683, -0.07331052422523499, -0.120787613093853, 0.04532235115766525, 0.045263588428497314, -0.06358368694782257, 0.06480421870946884, 0.08820623904466629, -0.1065920814871788, 0.1416475921869278, 0.07373794168233871, -0.01588049717247486, 0.07824484258890152, -0.17789237201213837, -0.03725104406476021, -0.036331940442323685, 0.007741300854831934, 0.07058489322662354, 0.0750737413764, -0.019116664305329323, 0.1746976524591446, -0.06598041951656342, 0.09777773916721344, 0.17528840899467468, -0.2887236773967743, -0.018040433526039124, 0.0492081381380558, 0.0887371376156807, 0.04900359362363815, -0.024227341637015343, 0.08339477330446243, 0.06365471333265305, 0.01686069741845131, 0.014271941967308521, -0.06960906833410263, -0.09346919506788254, 0.03645368665456772, -0.06932124495506287, -0.05699722096323967, 0.22001419961452484, -0.0334535576403141, 0.04674676060676575, -0.03953840583562851, -0.09316058456897736, -0.028927378356456757, -0.027232296764850616, -0.00907184462994337, -0.05413005128502846, 0.08754174411296844, -0.015151693485677242, -0.06331931799650192, -0.1435878872871399, -0.012912428006529808, -0.15805892646312714, 0.13816505670547485, 0.004333257209509611, 0.04586424678564072, -0.22094038128852844, 0.1012546569108963, 0.022817784920334816, -0.08995530754327774, 0.04930093511939049, -0.09425957500934601, 0.0717538446187973, 0.0007676240638829768, -0.04885277524590492, -0.02944779396057129, 0.08848895877599716, 0.14690880477428436, -0.02718975953757763, 0.005980455316603184, -0.01338018849492073, 0.0733228251338005, 0.059399042278528214, 0.028748195618391037, -0.006081049330532551, -0.052236080169677734, 0.05618719011545181, -0.1417204737663269, -0.010511515662074089, -0.07227712869644165, -0.10605388879776001, -0.04232237488031387, 0.03443120792508125, 0.060671232640743256, 0.042230576276779175, 0.11220116913318634, -0.04771716892719269, -0.01857093721628189, 0.05281376466155052, -0.03979083523154259, -0.008994937874376774, -0.01990325190126896, 0.018122754991054535, 0.13074275851249695, -0.019943278282880783, 0.03407962992787361, -0.10256942361593246, 0.05431444197893143, -0.10281401127576828, -0.01971535198390484, -0.014149561524391174, -0.04367954283952713, 0.031883664429187775, -0.12165860831737518, 0.016123656183481216, -0.16833168268203735, -0.14714312553405762, 0.002859292319044471, -0.016588665544986725, -0.017911825329065323, -0.07954888790845871, -0.04400517791509628, -0.02466505579650402, 0.06924423575401306, -0.04276731237769127, -0.00935916043817997, -0.05846982076764107, 0.11090090870857239, -0.05349889397621155, 0.07203050702810287, -0.1194647029042244, 0.0557217076420784, -0.14930842816829681, -0.013004516251385212, -0.04842504858970642, 0.07119924575090408, 0.015398351475596428, 0.1813964694738388, -0.006925920024514198, -0.003623353084549308, -0.09382472932338715, 0.06455672532320023, 
-0.02733452245593071, 0.24096953868865967, -0.0756828561425209, -0.14226967096328735, 0.21630549430847168, -0.06334739923477173, -0.14993034303188324, 0.1314547061920166, 0.01843975856900215, 0.08251222223043442, 0.10434340685606003, 0.19023460149765015, 0.01808990351855755, -0.007808534894138575, 0.054424818605184555, 0.07603957504034042, -0.1683882623910904, -0.033340878784656525, 0.0012923459289595485, -0.00014291972911451012, -0.1366809904575348, 0.04632483050227165, 0.1230006217956543, 0.09730340540409088, -0.07249721139669418, -0.018487868830561638, -0.030607668682932854, 0.0016078021144494414, 0.04144361615180969, -0.0005212334799580276, 0.09951234608888626, -0.1033509373664856, -0.04366454482078552, -0.06751791387796402, -0.002970147645100951, 0.011176802217960358, 0.03924661502242088, -0.04455869272351265, 0.09700342267751694, -0.007412149105221033, 0.0545678474009037, -0.13708296418190002, -0.07981666922569275, -0.016090448945760727, 0.1597585678100586, 0.040224816650152206, 0.04663374274969101, 0.0566885769367218, -0.05624469742178917, -0.015493324026465416, -0.010199432261288166, 0.16243304312229156, -0.04404180869460106, -0.07694169133901596, -0.07860849797725677, 0.10474636405706406, -0.06389671564102173, 0.026263169944286346, -0.051667314022779465, 0.024654213339090347, 0.04686986654996872, 0.1110762283205986, 0.004046999383717775, 0.026442723348736763, -0.012835992500185966, -0.007690808270126581, -0.07657550275325775, -0.01617686077952385, 0.1077079176902771, -0.0017721779877319932, -0.06809886544942856, 0.2437063455581665, -0.16884316504001617, 0.21163912117481232, 0.20976658165454865, -0.2492678016424179, -0.02882898785173893, -0.04848965257406235, -0.04766342043876648, -0.0012878701090812683, 0.06041788309812546, -0.034700244665145874, 0.09027024358510971, -0.03288675472140312, 0.16564396023750305, -0.051203593611717224, -0.07646744698286057, 0.019007064402103424, -0.05823178589344025, -0.05114857107400894, 0.07018019258975983, 0.08213616907596588, -0.1630844622850418, 0.18756183981895447, 0.21879082918167114, 0.06839460134506226, 0.2044064849615097, 0.00858453568071127, -0.010656360536813736, 0.07200875878334045, -0.04608747735619545, -0.03843220695853233, -0.06601633131504059, -0.15238076448440552, -0.03009703755378723, 0.06625645607709885, 0.030863380059599876, 0.09900964051485062, -0.09019728004932404, -0.08104760944843292, -0.017665131017565727, 0.004776675254106522, 0.00156646769028157, 0.11991100758314133, 0.03676433861255646, 0.13820022344589233, -0.01955524832010269, 0.022415857762098312, 0.08040772378444672, 0.016582515090703964, -0.10843544453382492, 0.16101348400115967, -0.13329310715198517, -0.3788211941719055, -0.14546175301074982, -0.13134250044822693, -0.020925991237163544, 0.03777816519141197, 0.1120775043964386, -0.1329103261232376, 0.005511005409061909, -0.007893978618085384, 0.10391844809055328, -0.08707519620656967, 0.039245378226041794, -0.07586963474750519, 0.0314689576625824, -0.060405436903238297, -0.07552991807460785, -0.03722400963306427, -0.028465405106544495, -0.09132689982652664, 0.16675986349582672, -0.11130212247371674, 0.06035055220127106, 0.16001324355602264, 0.021197395399212837, 0.03523072600364685, -0.05174810439348221, 0.18330632150173187, -0.112345851957798, 0.020098978653550148, 0.15624848008155823, -0.013005592860281467, 0.08254575729370117, 0.08188403397798538, -0.013132697902619839, -0.10316278785467148, 0.05240294709801674, 0.001463406952098012, -0.10209372639656067, -0.1950312703847885, -0.10119245946407318, 
-0.08230090886354446, 0.15922248363494873, 0.06361804902553558, 0.058937788009643555, 0.17968137562274933, 0.07578518986701965, -0.038606274873018265, -0.00038743947516195476, -0.00239798822440207, 0.08808282762765884, 0.13635766506195068, -0.01442645862698555, 0.1225903332233429, -0.04975935071706772, -0.10913994163274765, 0.12899059057235718, 0.01750512234866619, 0.03937286511063576, 0.051435839384794235, 0.021011192351579666, -0.011281835846602917, 0.11866551637649536, 0.13484057784080505, 0.10447502881288528, -0.015693627297878265, -0.0293489471077919, -0.04774824157357216, -0.01359935849905014, -0.033305928111076355, 0.03640862926840782, 0.008061517030000687, -0.14140670001506805, -0.06158366799354553, -0.11537835001945496, 0.08758961409330368, 0.10668005049228668, 0.07567808032035828, -0.21108253300189972, -0.003950516227632761, 0.07933880388736725, -0.03630997985601425, -0.11126025766134262, 0.08416172116994858, 0.03095286712050438, -0.1277567446231842, 0.07218055427074432, -0.03519461303949356, 0.12458370625972748, -0.0032897875644266605, 0.09583556652069092, -0.03598680719733238, -0.027483470737934113, -0.013308011926710606, 0.09818253666162491, -0.3191508650779724, 0.1621316522359848, -0.017933005467057228, -0.0618131123483181, -0.06667962670326233, -0.02528184838593006, 0.015994107350707054, 0.07729468494653702, 0.10861869156360626, 0.021759910508990288, 0.01640525460243225, -0.07345785945653915, -0.042352862656116486, 0.038021303713321686, 0.12403716146945953, -0.06827268749475479, -0.012903391383588314, -0.04523605480790138, 0.00796645786613226, -0.017124788835644722, -0.008793274872004986, 0.006911922711879015, -0.14962191879749298, 0.05182485654950142, 0.014736213721334934, 0.07058768719434738, 0.0436982735991478, -0.014969068579375744, -0.09180716425180435, 0.18274778127670288, -0.015714606270194054, -0.07271543145179749, -0.12616917490959167, -0.05262751132249832, 0.030376195907592773, -0.05518756061792374, 0.021047864109277725, -0.06501689553260803, -0.0035362408962100744, -0.06755607575178146, -0.22007296979427338, 0.1278373897075653, -0.08437205106019974, -0.07192739844322205, -0.04912353679537773, 0.2010866105556488, -0.051223888993263245, 0.003238252131268382, 0.010222852230072021, 0.021994104608893394, -0.11474784463644028, -0.09469719231128693, 0.07112357765436172, -0.03247172012925148, 0.03123478777706623, 0.0022505864035338163, -0.04091062396764755, 0.016593176871538162, -0.06314414739608765, -0.011381587944924831, 0.27866554260253906, 0.23951324820518494, -0.040407944470644, 0.1904350072145462, 0.11012271791696548, -0.08163551241159439, -0.3069863021373749, -0.10166139155626297, -0.12140648066997528, -0.02996143139898777, -0.017288926988840103, -0.16865339875221252, 0.06477722525596619, 0.038930367678403854, 0.009261871688067913, 0.13778774440288544, -0.20730599761009216, -0.08823523670434952, 0.09138026833534241, -0.02557477355003357, 0.43079736828804016, -0.1257614940404892, -0.08959750831127167, -0.051866497844457626, -0.16516901552677155, 0.2173919379711151, -0.021592965349555016, 0.07857322692871094, -0.029561417177319527, 0.11770006269216537, 0.04697660356760025, -0.010707763023674488, 0.08040876686573029, -0.00884756539016962, 0.008373050950467587, -0.12410011142492294, -0.02768467366695404, 0.04874192550778389, 0.012378438375890255, 0.0013600040692836046, -0.09389680624008179, 0.020313434302806854, -0.15990203619003296, -0.018549781292676926, -0.11233476549386978, 0.07682323455810547, 0.025788001716136932, -0.06466120481491089, -0.003637736663222313, 
-0.04986237734556198, -0.015892893075942993, -0.01400828268378973, 0.1717434972524643, -0.04862768203020096, 0.19366511702537537, 0.03501616790890694, 0.11570870876312256, -0.1362973153591156, 0.06143493950366974, -0.06429426372051239, -0.07528600096702576, 0.07427702099084854, -0.1537967324256897, 0.05111055448651314, 0.09430045634508133, -0.030276626348495483, 0.05380253866314888, 0.08795086294412613, -0.003969982732087374, 0.004800081253051758, 0.15867236256599426, -0.2786487936973572, 0.01320126373320818, -0.07396841049194336, -0.06665283441543579, 0.10506758838891983, 0.06261139363050461, 0.17162823677062988, 0.011681869626045227, -0.056615445762872696, 0.01595049723982811, 0.02499506063759327, -0.04915530979633331, 0.04529924690723419, 0.008104361593723297, -0.010991688817739487, -0.13640300929546356, 0.08699746429920197, 0.0042801909148693085, -0.1531187742948532, 0.024680746719241142, 0.2155698835849762, -0.1260155886411667, -0.10237220674753189, -0.03444112092256546, 0.08444061875343323, -0.11519137024879456, 0.01753072999417782, -0.030764780938625336, -0.09109894186258316, 0.07448896765708923, 0.15248911082744598, 0.049206193536520004, 0.11775100976228714, -0.015379221178591251, -0.011753370985388756, -0.05147303268313408, -0.0317845419049263, 0.025745956227183342, 0.017857374623417854, -0.08257177472114563, 0.06648801267147064, -0.022109810262918472, 0.14559012651443481, -0.09791336953639984, -0.06602771580219269, -0.1468091756105423, -0.009785634465515614, -0.0695481076836586, -0.09207163751125336, -0.08133620768785477, -0.062133077532052994, 0.0010387726360931993, -0.03962359577417374, -0.04795864596962929, -0.0791037380695343, -0.10289866477251053, 0.009435068815946579, -0.02305566892027855, 0.03256045654416084, -0.06115729361772537, 0.007872066460549831, 0.12092912197113037, -0.028174830600619316, 0.16686207056045532, 0.1458095908164978, -0.09536580741405487, 0.10568815469741821, -0.16346460580825806, -0.08964221179485321, 0.0939340740442276, -0.01729099079966545, 0.027899714186787605, 0.11666940152645111, 0.014932696707546711, 0.04195788502693176, 0.035977672785520554, 0.06045130267739296, 0.03587699308991432, -0.11899011582136154, 0.07665140181779861, 0.009481414221227169, -0.1612047255039215, -0.06303887814283371, -0.08555969595909119, 0.030386725440621376, 0.021575886756181717, 0.12225193530321121, -0.045776769518852234, 0.0887017622590065, -0.07972796261310577, 0.027257539331912994, 0.02293219044804573, -0.181223064661026, -0.047844018787145615, -0.053065262734889984, 0.032686229795217514, 0.018960151821374893, 0.1893557906150818, 0.027213018387556076, -0.03697650134563446, 0.04549255222082138, 0.1042066365480423, 0.005313898902386427, 0.004829791374504566, 0.16259528696537018, 0.09423433989286423, -0.07654286175966263, -0.12226779758930206, 0.07556461542844772, 0.019673259928822517, -0.044067107141017914, 0.10607215762138367, -0.002448870101943612, 0.020163848996162415, 0.06910120695829391, -0.014892932027578354, 0.034322552382946014, -0.044286008924245834, -0.10698256641626358, -0.023580113425850868, 0.046367425471544266, 0.00669879000633955, 0.12847968935966492, 0.177873894572258, -0.002574790036305785, 0.025011489167809486, -0.0363602340221405, -0.024931130930781364, -0.13864666223526, -0.1558164656162262, -0.06855984032154083, -0.14875617623329163, 0.012976853176951408, -0.0915176048874855, 0.04695429280400276, 0.028682325035333633, 0.06887643784284592, -0.07052405923604965, 0.04384735971689224, 0.06974220275878906, -0.12065785378217697, 0.09397104382514954, 
-0.028081456199288368, 0.03704333305358887, -0.006730496883392334, -0.012833851389586926, -0.10013298690319061, 0.035936567932367325, -0.01747855544090271, 0.045271266251802444, -0.04546798765659332, 0.030429324135184288, -0.1703072488307953, -0.124412901699543, -0.04034453630447388, 0.06420420855283737, -0.06510858237743378, 0.03512151539325714, 0.019115818664431572, 0.013339218683540821, 0.03305599465966225, 0.23020225763320923, -0.03704051673412323, -0.02329315058887005, -0.042310282588005066, 0.16692522168159485, -0.014016710221767426, 0.08088304847478867, -0.03037172369658947, 0.0002500463742762804, -0.08417443931102753, 0.3385351300239563, 0.3027777075767517, -0.09020252525806427, 0.019915465265512466, -0.030905582010746002, 0.03936264291405678, 0.11892254650592804, 0.13376617431640625, 0.09784641861915588, 0.2282467782497406, -0.07217609137296677, -0.03032243251800537, -0.020507147535681725, -0.011079044081270695, -0.06650827825069427, 0.0879674032330513, 0.02507801540195942, -0.05553486570715904, -0.031693898141384125, 0.0812700018286705, -0.2327648252248764, 0.10665327310562134, -0.11289316415786743, -0.1636168211698532, -0.039189815521240234, 0.0042042857967317104, 0.08908319473266602, 0.015396242961287498, 0.11228121817111969, 0.009163780137896538, -0.07585213333368301, 0.017798418179154396, 0.028085503727197647, -0.24201616644859314, -0.008133855648338795, 0.060310713946819305, -0.12939085066318512, -0.004324504639953375, -0.027167800813913345, 0.007199867628514767, 0.059822265058755875, 0.029368450865149498, -0.04319324716925621, -0.001257759635336697, -0.010450302623212337, -0.008644461631774902, -0.011618612334132195, 0.07065588980913162, 0.046958792954683304, -0.13329142332077026, 0.06869500875473022, -0.11774353682994843, 0.033477768301963806, -0.05866728723049164, -0.015255378559231758, 0.000037100471672602, 0.03460683673620224, -0.04829782620072365, 0.07058211416006088, 0.07688362896442413, -0.015606098808348179, 0.000610517687164247, -0.0802936851978302, -0.036274004727602005, -0.019796574488282204, -0.09252054989337921, -0.08371094614267349, -0.13031646609306335, -0.11573562026023865, 0.1029667928814888, -0.02224794402718544, -0.19213621318340302, 0.03111329674720764, -0.12165344506502151, 0.045619383454322815, -0.1751558482646942, 0.11076030135154724, 0.08046020567417145, 0.01831907220184803, 0.011516088619828224, -0.02576824277639389, 0.08821021765470505, 0.11728470027446747, -0.07783648371696472, -0.08528783172369003 ]