Dataset schema (each record below lists these fields in order, one per line):
- modelId: string (length 4–112)
- sha: string (length 40)
- lastModified: string (length 24)
- tags: sequence
- pipeline_tag: string (29 classes)
- private: bool (1 class)
- author: string (length 2–38)
- config: null
- id: string (length 4–112)
- downloads: float64 (0–36.8M)
- likes: float64 (0–712)
- library_name: string (17 classes)
- __index_level_0__: int64 (0–38.5k)
- readme: string (length 0–186k)
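A minimal sketch of inspecting a dump with this schema, assuming it has been exported to Parquet; the file name `models_dump.parquet` is a placeholder, not something named in this document:

```python
# Hypothetical sketch: inspect a dump with the schema above.
# "models_dump.parquet" is a placeholder path, not a file named in this document.
import pandas as pd

df = pd.read_parquet("models_dump.parquet")
# Peek at the core metadata columns for the first few records.
print(df[["modelId", "pipeline_tag", "library_name", "downloads", "likes"]].head())
# pipeline_tag is categorical with up to 29 classes per the schema.
print(df["pipeline_tag"].value_counts())
```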
scasutt/wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise
5949a4f7ac9fa00d3f0ef3b589a72baeb16ff8c8
2022-04-19T19:21:38.000Z
[ "pytorch", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
scasutt
null
scasutt/wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise
0
null
transformers
37,000
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3471 - Wer: 0.4048 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.0574 | 1.68 | 500 | 3.4185 | 0.9954 | | 1.45 | 3.36 | 1000 | 0.7043 | 0.7171 | | 0.8285 | 5.03 | 1500 | 0.3874 | 0.5050 | | 0.668 | 6.71 | 2000 | 0.3321 | 0.4512 | | 0.5324 | 8.39 | 2500 | 0.3394 | 0.4321 | | 0.4775 | 10.07 | 3000 | 0.3533 | 0.4231 | | 0.4421 | 11.74 | 3500 | 0.3487 | 0.4084 | | 0.441 | 13.42 | 4000 | 0.3471 | 0.4048 | ### Framework versions - Transformers 4.18.0 - Pytorch 1.11.0+cu102 - Datasets 2.1.0 - Tokenizers 0.12.1
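The card above reports loss and WER but omits an inference snippet; a minimal usage sketch with the `transformers` ASR pipeline, assuming a 16 kHz audio file (`sample.wav` is a placeholder path, not from the card):

```python
# Minimal inference sketch (not part of the card above).
# "sample.wav" is a placeholder path to a 16 kHz mono audio file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="scasutt/wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise",
)
print(asr("sample.wav")["text"])  # transcription as a plain string
```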
stevems1/bert-base-uncased-Ganapati
dcf3d4abdcd06e1f953c89a19264816061c891d5
2022-04-19T13:47:44.000Z
[ "pytorch", "tensorboard", "bert", "fill-mask", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index", "autotrain_compatible" ]
fill-mask
false
stevems1
null
stevems1/bert-base-uncased-Ganapati
0
null
transformers
37,001
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: bert-base-uncased-Ganapati results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bert-base-uncased-Ganapati This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.0000 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0 | 1.0 | 2273 | 0.0000 | | 0.0 | 2.0 | 4546 | 0.0000 | | 0.0 | 3.0 | 6819 | 0.0000 | ### Framework versions - Transformers 4.18.0 - Pytorch 1.10.0+cu111 - Datasets 2.1.0 - Tokenizers 0.12.1
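The card above gives hyperparameters but no usage example; a minimal fill-mask sketch (the prompt sentence is a placeholder, and since the card reports a 0.0000 validation loss, outputs are worth sanity-checking):

```python
# Minimal inference sketch (not part of the card above).
from transformers import pipeline

unmasker = pipeline("fill-mask", model="stevems1/bert-base-uncased-Ganapati")
# Each prediction carries the filled token and its probability.
for pred in unmasker("The capital of France is [MASK]."):
    print(pred["token_str"], round(pred["score"], 4))
```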
tuhailong/PairSupCon-roberta-wwm-ext
986354bd345575de9f51f4a588b348b883e71bcb
2022-04-20T02:44:32.000Z
[ "pytorch", "bert", "zh", "dataset:dialogue", "transformers", "sbert" ]
null
false
tuhailong
null
tuhailong/PairSupCon-roberta-wwm-ext
0
null
transformers
37,002
--- language: zh tags: - sbert datasets: - dialogue --- # Data The training data consists of similar-sentence pairs from e-commerce dialogue, about 500k pairs. ## Model The model was built with [sentence-transformers](https://www.sbert.net/index.html) and uses a bi-encoder architecture; it was trained with the code from [PairSupCon](https://github.com/amazon-research/sentence-representations/tree/main/PairSupCon). ### Usage [test.py](https://github.com/TTurn/sentence-representations/edit/main/PairSupCon/test.py) #### Code Training code: https://github.com/TTurn/sentence-representations/tree/main/PairSupCon
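Since the card links to a test script but shows no inline usage, a hedged sketch assuming the checkpoint loads as a sentence-transformers bi-encoder, as the card implies; the example sentences are placeholders:

```python
# Hedged usage sketch: encode two sentences and compare them, assuming the
# repository is loadable with sentence-transformers as the card implies.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("tuhailong/PairSupCon-roberta-wwm-ext")
embeddings = model.encode(
    ["请问订单什么时候发货？", "你好，我的订单发货了吗？"],  # placeholder queries
    convert_to_tensor=True,
)
print(util.cos_sim(embeddings[0], embeddings[1]))  # cosine similarity score
```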
nielsr/segformer-finetuned-sidewalk-trainer
0ddc4e6a88cd865b2d026277f182367e5f5dc241
2022-04-19T13:28:26.000Z
[ "pytorch", "tensorboard", "segformer", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
null
false
nielsr
null
nielsr/segformer-finetuned-sidewalk-trainer
0
null
transformers
37,003
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: segformer-finetuned-sidewalk-trainer results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # segformer-finetuned-sidewalk-trainer This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10.0 ### Training results ### Framework versions - Transformers 4.19.0.dev0 - Pytorch 1.11.0+cu113 - Datasets 2.0.0 - Tokenizers 0.11.6
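The card above has no usage section; a minimal semantic-segmentation sketch with the `transformers` SegFormer classes (`sidewalk.jpg` is a placeholder image path, not from the card):

```python
# Minimal inference sketch (not part of the card above).
# "sidewalk.jpg" is a placeholder path to an RGB image.
from PIL import Image
from transformers import SegformerFeatureExtractor, SegformerForSemanticSegmentation

repo = "nielsr/segformer-finetuned-sidewalk-trainer"
extractor = SegformerFeatureExtractor.from_pretrained(repo)
model = SegformerForSemanticSegmentation.from_pretrained(repo)

inputs = extractor(images=Image.open("sidewalk.jpg"), return_tensors="pt")
logits = model(**inputs).logits  # (batch, num_labels, height/4, width/4)
print(logits.argmax(dim=1).shape)  # per-pixel class ids at reduced resolution
```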
phosseini/glucose-bert-large
186c9a6f8ed11bb49ea9a836988cb97206d59405
2022-04-19T19:00:21.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
phosseini
null
phosseini/glucose-bert-large
0
null
transformers
37,004
Entry not found
tau/false_large_pmi_para0_sent1_span2_True_multi_masks_7_1024_0.3_epoch1
0b5cd8bbe338236c89f31d18dfd428969e6fa7b6
2022-04-19T18:59:48.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
tau
null
tau/false_large_pmi_para0_sent1_span2_True_multi_masks_7_1024_0.3_epoch1
0
null
transformers
37,005
Entry not found
tau/false_large_rouge_para0_sent1_span2_True_multi_masks_7_1024_0.3_epoch1
b88869217463c7c2c2c022acdcb9aa1cf49f4644
2022-04-19T19:13:52.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
tau
null
tau/false_large_rouge_para0_sent1_span2_True_multi_masks_7_1024_0.3_epoch1
0
null
transformers
37,006
Entry not found
maveriq/lingbert-mini-1M
bf5707aa16b1c2c45358afb2be6285cbe0961ccd
2022-04-19T19:20:28.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
maveriq
null
maveriq/lingbert-mini-1M
0
null
transformers
37,007
Entry not found
scasutt/wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise_slow_fast
9fab457fefa9d78cb06b3bd1ceca51643113d4ff
2022-04-20T04:52:57.000Z
[ "pytorch", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
scasutt
null
scasutt/wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise_slow_fast
0
null
transformers
37,008
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise_slow_fast results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-large-xlsr-53_toy_train_fast_masked_augment_random_noise_slow_fast This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4007 - Wer: 0.3785 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 15 ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.0535 | 1.2 | 500 | 3.3994 | 0.9954 | | 1.1495 | 2.4 | 1000 | 0.6490 | 0.7155 | | 0.7148 | 3.6 | 1500 | 0.3812 | 0.4690 | | 0.5305 | 4.8 | 2000 | 0.3529 | 0.4373 | | 0.475 | 6.0 | 2500 | 0.3616 | 0.4123 | | 0.3772 | 7.19 | 3000 | 0.3823 | 0.4074 | | 0.3632 | 8.39 | 3500 | 0.3665 | 0.3929 | | 0.3579 | 9.59 | 4000 | 0.3838 | 0.3917 | | 0.3386 | 10.79 | 4500 | 0.3888 | 0.3839 | | 0.3193 | 11.99 | 5000 | 0.3872 | 0.3757 | | 0.2976 | 13.19 | 5500 | 0.3986 | 0.3785 | | 0.2915 | 14.39 | 6000 | 0.4007 | 0.3785 | ### Framework versions - Transformers 4.18.0 - Pytorch 1.11.0+cu102 - Datasets 2.1.0 - Tokenizers 0.12.1
huggingtweets/billgates-kellytclements-xychelsea
ef8f0595bacf64533a01c3b1a3ff6c415d6dbdfe
2022-04-19T20:11:34.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/billgates-kellytclements-xychelsea
0
null
transformers
37,009
--- language: en thumbnail: http://www.huggingtweets.com/billgates-kellytclements-xychelsea/1650398924367/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1256728742292074496/96By_wwT_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1414439092373254147/JdS8yLGI_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1431338485504430082/zQ6S8nOo_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Chelsea E. Manning & Bill Gates & Kelly T. Clements</div> <div style="text-align: center; font-size: 14px;">@billgates-kellytclements-xychelsea</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Chelsea E. Manning & Bill Gates & Kelly T. Clements. | Data | Chelsea E. Manning | Bill Gates | Kelly T. Clements | | --- | --- | --- | --- | | Tweets downloaded | 3248 | 3213 | 1777 | | Retweets | 15 | 199 | 296 | | Short tweets | 1219 | 7 | 26 | | Tweets kept | 2014 | 3007 | 1455 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/37pv1ayu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @billgates-kellytclements-xychelsea's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2e303z5q) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2e303z5q/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/billgates-kellytclements-xychelsea') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
proseph/ctrlv-wav2vec2-tokenizer
ad6a9da2c4e759ae266dfb57a3c159b87363eb42
2022-04-20T03:40:35.000Z
[ "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
proseph
null
proseph/ctrlv-wav2vec2-tokenizer
0
null
transformers
37,010
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: ctrlv-wav2vec2-tokenizer results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # ctrlv-wav2vec2-tokenizer This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.3967 - Wer: 0.3138 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.4359 | 3.45 | 500 | 1.3595 | 0.9159 | | 0.5692 | 6.9 | 1000 | 0.4332 | 0.4036 | | 0.2198 | 10.34 | 1500 | 0.4074 | 0.3678 | | 0.1314 | 13.79 | 2000 | 0.3480 | 0.3409 | | 0.0929 | 17.24 | 2500 | 0.3714 | 0.3346 | | 0.0692 | 20.69 | 3000 | 0.3977 | 0.3224 | | 0.0542 | 24.14 | 3500 | 0.4068 | 0.3187 | | 0.0422 | 27.59 | 4000 | 0.3967 | 0.3138 | ### Framework versions - Transformers 4.11.3 - Pytorch 1.10.0+cu111 - Datasets 1.18.3 - Tokenizers 0.10.3
waynehills/Waynehills_mT5_Mulang
4534d8bf839325b63d90b905c4b952f76a5ac563
2022-04-21T04:12:17.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
waynehills
null
waynehills/Waynehills_mT5_Mulang
0
null
transformers
37,011
Entry not found
huggingtweets/elonmusk-iamsrk
2d303af1b3a526d47c6ad370a6be47406bdd6def
2022-04-20T04:58:07.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/elonmusk-iamsrk
0
null
transformers
37,012
--- language: en thumbnail: http://www.huggingtweets.com/elonmusk-iamsrk/1650430682800/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1503591435324563456/foUrqiEw_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1318511011117199362/htNsviXp_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Elon Musk & Shah Rukh Khan</div> <div style="text-align: center; font-size: 14px;">@elonmusk-iamsrk</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Elon Musk & Shah Rukh Khan. | Data | Elon Musk | Shah Rukh Khan | | --- | --- | --- | | Tweets downloaded | 221 | 3212 | | Retweets | 14 | 56 | | Short tweets | 69 | 278 | | Tweets kept | 138 | 2878 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/39qg1l4s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @elonmusk-iamsrk's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/840j96ek) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/840j96ek/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/elonmusk-iamsrk') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
obokkkk/wav2vec2-base-timit-demo-colab
5a63ee31a1e6a01b97ce88e6334315d40b2798be
2022-04-21T09:23:05.000Z
[ "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
obokkkk
null
obokkkk/wav2vec2-base-timit-demo-colab
0
null
transformers
37,013
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: wav2vec2-base-timit-demo-colab results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-timit-demo-colab This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4779 - Wer: 0.3468 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.4408 | 4.0 | 500 | 1.2302 | 0.9116 | | 0.561 | 8.0 | 1000 | 0.4809 | 0.4320 | | 0.2091 | 12.0 | 1500 | 0.4285 | 0.3880 | | 0.1221 | 16.0 | 2000 | 0.4448 | 0.3665 | | 0.0858 | 20.0 | 2500 | 0.4622 | 0.3585 | | 0.0597 | 24.0 | 3000 | 0.4621 | 0.3517 | | 0.0453 | 28.0 | 3500 | 0.4779 | 0.3468 | ### Framework versions - Transformers 4.11.3 - Pytorch 1.10.0+cu111 - Datasets 1.18.3 - Tokenizers 0.10.3
npleshkanov/adapter_labse_intent_classifier
3842ffb0110f57adba6dae3cf338cd6ab13c3469
2022-04-20T09:52:52.000Z
[ "pytorch", "tensorboard", "bert", "transformers" ]
null
false
npleshkanov
null
npleshkanov/adapter_labse_intent_classifier
0
null
transformers
37,014
Entry not found
masakhane/afrimbart_wol_fr_news
19a1183c5029c107962bad3c0284d3d0951583a2
2022-04-20T13:52:41.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimbart_wol_fr_news
0
null
transformers
37,015
--- license: afl-3.0 ---
masakhane/afrimbart_fr_wol_news
b4e4d676d987465f6e4eb7cbebc7e84ad332c960
2022-04-20T13:52:44.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimbart_fr_wol_news
0
null
transformers
37,016
--- license: afl-3.0 ---
masakhane/afrimt5_wol_fr_news
700dcb6ea1923af048926f3b4adc2e6726faee70
2022-04-20T13:52:48.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimt5_wol_fr_news
0
null
transformers
37,017
--- license: afl-3.0 ---
masakhane/afrimt5_fr_wol_news
5afbdb9eb327b7009aca2d65fa2d16065e509f11
2022-04-20T13:52:52.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimt5_fr_wol_news
0
null
transformers
37,018
--- license: afl-3.0 ---
masakhane/afribyt5_wol_fr_news
c395035a83d563630cdc87fc759ec46e11a8a8d0
2022-04-20T15:07:52.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afribyt5_wol_fr_news
0
null
transformers
37,019
--- license: afl-3.0 ---
masakhane/afribyt5_fr_wol_news
c26279e4dff10710d4cc30c5de5c9bf400d1ab35
2022-04-20T15:08:03.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afribyt5_fr_wol_news
0
null
transformers
37,020
--- license: afl-3.0 ---
masakhane/byt5_wol_fr_news
516b1ccf80faee0103751e007bf34a0c11a75728
2022-04-20T15:07:55.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/byt5_wol_fr_news
0
null
transformers
37,021
--- license: afl-3.0 ---
masakhane/byt5_fr_wol_news
e8ddd544849103ede2d293d8b10ab822343e0164
2022-04-20T15:07:59.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/byt5_fr_wol_news
0
null
transformers
37,022
--- license: afl-3.0 ---
masakhane/mt5_fr_wol_news
6b0a2c366cc2b803ceff8e8ba1cfab76609758ec
2022-04-20T16:19:43.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mt5_fr_wol_news
0
null
transformers
37,023
--- license: afl-3.0 ---
masakhane/mt5_wol_fr_news
11c810d667fb3f28aa7da7dc10359728e7ac7db2
2022-04-20T16:19:28.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mt5_wol_fr_news
0
null
transformers
37,024
--- license: afl-3.0 ---
masakhane/mbart50_wol_fr_news
7a094fdd844d2313e54a06bf98aef31440884005
2022-04-20T16:19:39.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mbart50_wol_fr_news
0
null
transformers
37,025
--- license: afl-3.0 ---
masakhane/mbart50_fr_wol_news
a7832d14b32b2af8dd818a33105b2994ea263b0c
2022-04-20T16:19:22.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mbart50_fr_wol_news
0
null
transformers
37,026
--- license: afl-3.0 ---
masakhane/m2m100_418M_fr_wol_news
6a16428266a82a4babb3485fc5236d11c5f543ed
2022-04-20T17:34:38.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_fr_wol_news
0
null
transformers
37,027
--- license: afl-3.0 ---
masakhane/m2m100_418M_wol_fr_news
6a840a39b5f20167d2abefa6d0ed1fd48acbe26c
2022-04-20T17:34:48.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_wol_fr_news
0
null
transformers
37,028
--- license: afl-3.0 ---
masakhane/m2m100_418M_wol_fr_rel_news
b0a6d312a9d47ae359264f0d5d67918da2280795
2022-04-20T17:34:53.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_wol_fr_rel_news
0
null
transformers
37,029
--- license: afl-3.0 ---
masakhane/m2m100_418M_fr_wol_rel_news_ft
728c510a0119d79e3724538a25d6f1bec7f4d41e
2022-04-20T19:20:09.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_fr_wol_rel_news_ft
0
null
transformers
37,030
--- license: afl-3.0 ---
masakhane/m2m100_418M_wol_fr_rel_ft
73afafaee1f9eac624ebd7ef51a167064de3c6fd
2022-04-20T18:36:05.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_wol_fr_rel_ft
0
null
transformers
37,031
--- license: afl-3.0 ---
masakhane/m2m100_418M_fr_wol_rel_ft
ee6f7c7b7890979631dc57b33801c01433a531e8
2022-04-20T18:36:18.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_fr_wol_rel_ft
0
null
transformers
37,032
--- license: afl-3.0 ---
masakhane/m2m100_418M_fr_wol_rel
b1767f2f89a6f2c55d9cdf7858fc77c16a1f1d8c
2022-04-20T19:20:13.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_fr_wol_rel
0
null
transformers
37,033
--- license: afl-3.0 ---
jesperjmb/MergeIntrosNSP
13259bae8aba94d0fc206dfc40d02bb34c9f1bd9
2022-05-19T08:04:20.000Z
[ "pytorch", "bert", "next-sentence-prediction", "transformers" ]
null
false
jesperjmb
null
jesperjmb/MergeIntrosNSP
0
null
transformers
37,034
obokkkk/wav2vec2-base-timit-demo-colab2
ff346d67262d9692ad941e0b7588df97ffccedd2
2022-04-20T19:01:52.000Z
[ "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
obokkkk
null
obokkkk/wav2vec2-base-timit-demo-colab2
0
null
transformers
37,035
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: wav2vec2-base-timit-demo-colab2 results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # wav2vec2-base-timit-demo-colab2 This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.4805 - Wer: 0.3398 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0001 - train_batch_size: 32 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 1000 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:------:| | 3.4737 | 4.0 | 500 | 1.2889 | 0.9293 | | 0.5838 | 8.0 | 1000 | 0.4751 | 0.4353 | | 0.2141 | 12.0 | 1500 | 0.4809 | 0.3881 | | 0.1259 | 16.0 | 2000 | 0.4587 | 0.3683 | | 0.084 | 20.0 | 2500 | 0.4941 | 0.3601 | | 0.0582 | 24.0 | 3000 | 0.4811 | 0.3482 | | 0.0439 | 28.0 | 3500 | 0.4805 | 0.3398 | ### Framework versions - Transformers 4.11.3 - Pytorch 1.10.0+cu111 - Datasets 1.18.3 - Tokenizers 0.10.3
huggingtweets/elonmusk-nicolebehnam-punk6529
9cc3f6a2028167aa06ed7c6b654ac78b3169f26e
2022-04-20T20:38:53.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/elonmusk-nicolebehnam-punk6529
0
null
transformers
37,036
--- language: en thumbnail: http://www.huggingtweets.com/elonmusk-nicolebehnam-punk6529/1650487127903/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1503591435324563456/foUrqiEw_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1440017111531855879/A4p6F07H_400x400.jpg&#39;)"> </div> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1505511419982213126/2XfmKzFp_400x400.jpg&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Elon Musk & 6529 & nic b</div> <div style="text-align: center; font-size: 14px;">@elonmusk-nicolebehnam-punk6529</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Elon Musk & 6529 & nic b. | Data | Elon Musk | 6529 | nic b | | --- | --- | --- | --- | | Tweets downloaded | 640 | 3241 | 3249 | | Retweets | 34 | 887 | 241 | | Short tweets | 201 | 390 | 1088 | | Tweets kept | 405 | 1964 | 1920 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3d9axu9g/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @elonmusk-nicolebehnam-punk6529's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/ekidqlxj) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/ekidqlxj/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/elonmusk-nicolebehnam-punk6529') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/nicolebehnam
554f486b68d25cb2c97a02a24a325cda789e8f7d
2022-04-20T21:05:47.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/nicolebehnam
0
null
transformers
37,037
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1505511419982213126/2XfmKzFp_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">nic b</div> <div style="text-align: center; font-size: 14px;">@nicolebehnam</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from nic b. | Data | nic b | | --- | --- | | Tweets downloaded | 3249 | | Retweets | 241 | | Short tweets | 1088 | | Tweets kept | 1920 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/a4rx8y3x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @nicolebehnam's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/y6mwoo39) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/y6mwoo39/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/nicolebehnam') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/torstenvolk
1a9e57f5251bbdb763ee2bb40028b83bf303722b
2022-04-21T00:16:11.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/torstenvolk
0
null
transformers
37,038
--- language: en thumbnail: http://www.huggingtweets.com/torstenvolk/1650500124030/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1575782906/110930-ENMA-115240-web_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Torsten Volk</div> <div style="text-align: center; font-size: 14px;">@torstenvolk</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Torsten Volk. | Data | Torsten Volk | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 449 | | Short tweets | 60 | | Tweets kept | 2741 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2pgfl6jg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @torstenvolk's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1iccl44p) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1iccl44p/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/torstenvolk') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
nnn/shangpin-pre-training
05a58c3c7d69b44d50b4b31a6a4b1aa409ccac65
2022-04-21T03:10:08.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
nnn
null
nnn/shangpin-pre-training
0
null
transformers
37,039
wojciechkrukar/t5-small-finetuned-xsum
1c6694683ac1cc2bf0588118ae00424baac0f3de
2022-04-21T07:23:54.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
wojciechkrukar
null
wojciechkrukar/t5-small-finetuned-xsum
0
null
transformers
37,040
Entry not found
frozenwalker/SciFive_pubmedqa_question_generation_using_NmCo_prompt_entity
159ccde11c12e06162d73c6a58920b86e2875fba
2022-04-21T06:32:06.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
frozenwalker
null
frozenwalker/SciFive_pubmedqa_question_generation_using_NmCo_prompt_entity
0
null
transformers
37,041
Entry not found
satpalsr/arbit-test
95b394990c327c98ccc8eeb8fbe398097df6af50
2022-04-21T08:33:16.000Z
[ "pytorch" ]
null
false
satpalsr
null
satpalsr/arbit-test
0
null
null
37,042
Entry not found
huggingtweets/route2fi
680a2b334d2e528ea01323d3c789beae7eeaa49e
2022-04-21T10:07:42.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/route2fi
0
null
transformers
37,043
--- language: en thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1469588644088451073/VEu0DKDG_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">Route 2 FI</div> <div style="text-align: center; font-size: 14px;">@route2fi</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from Route 2 FI. | Data | Route 2 FI | | --- | --- | | Tweets downloaded | 3250 | | Retweets | 1 | | Short tweets | 264 | | Tweets kept | 2985 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1gjkyb1x/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @route2fi's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3q0o96ub) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3q0o96ub/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/route2fi') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
simonnedved/bert-mlm
af805bbc8a0f430b0d387c1b9ace8ab178b38931
2022-04-21T14:21:11.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
simonnedved
null
simonnedved/bert-mlm
0
null
transformers
37,044
--- license: apache-2.0 ---
orendar/en_he_roberta
72ba0f6a6167be1e0576c2da668342deb039e1b0
2022-04-21T16:16:09.000Z
[ "pytorch", "encoder-decoder", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
orendar
null
orendar/en_he_roberta
0
null
transformers
37,045
Entry not found
surajnair/r3m-50
51a1ff994641fd1c6880d34a9ea0664e46d9fbbb
2022-04-21T20:32:54.000Z
[ "pytorch", "r3m", "transformers" ]
null
false
surajnair
null
surajnair/r3m-50
0
null
transformers
37,046
This model contains the pre-trained ResNet50 R3M model from the paper "R3M: A Universal Visual Representation for Robot Manipulation" (Nair et al.) The model is trained on the Ego4D dataset using time-contrastive learning, video-language alignment, and sparsity objectives. It is used for efficient downstream robotic learning.
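A hedged sketch of embedding an image with R3M, following the usage documented in the paper's repository (assumes the `r3m` package from github.com/facebookresearch/r3m is installed; the input tensor is a placeholder):

```python
# Hedged usage sketch based on the R3M repository's documented API;
# R3M expects image tensors in the 0-255 range.
import torch
from r3m import load_r3m

r3m = load_r3m("resnet50")  # the ResNet50 checkpoint this card describes
r3m.eval()

image = torch.randint(0, 255, (1, 3, 224, 224)).float()  # placeholder batch
with torch.no_grad():
    embedding = r3m(image)
print(embedding.shape)  # (1, 2048) for the ResNet50 backbone
```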
julien-c/gpt2-from-colab
736cc7d10233a138577d734818b50a0442da59d6
2022-04-21T20:12:19.000Z
[ "pytorch", "license:apache-2.0" ]
null
false
julien-c
null
julien-c/gpt2-from-colab
0
null
null
37,047
--- license: apache-2.0 --- hello
masakhane/afrimbart_ibo_en_news
7887e1db10e6872c90571c11826634714ff87ba3
2022-04-22T09:40:50.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimbart_ibo_en_news
0
null
transformers
37,048
--- license: afl-3.0 ---
masakhane/afrimbart_en_ibo_news
866a42aa51076b11667618711ef7b6f746658105
2022-04-22T09:40:47.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afrimbart_en_ibo_news
0
null
transformers
37,049
--- license: afl-3.0 ---
masakhane/afribyt5_ibo_en_news
244cfcc77821f0840e327024c774b70fca78b236
2022-04-22T10:50:16.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/afribyt5_ibo_en_news
0
null
transformers
37,050
--- license: afl-3.0 ---
masakhane/mbart50_ibo_en_news
2259ca6f3836d22660dd05c8af922c6e0f3c1b22
2022-04-22T10:50:22.000Z
[ "pytorch", "mbart", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mbart50_ibo_en_news
0
null
transformers
37,051
--- license: afl-3.0 ---
masakhane/mt5_ibo_en_news
5669c0487d9ebb2661c7daaa9b2c918d3dc5271d
2022-04-22T11:48:33.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mt5_ibo_en_news
0
null
transformers
37,052
--- license: afl-3.0 ---
masakhane/mt5_en_ibo_news
56c8c5361ac008e699b15b4ae58d5af4dad5efcc
2022-04-22T11:48:39.000Z
[ "pytorch", "mt5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/mt5_en_ibo_news
0
null
transformers
37,053
--- license: afl-3.0 ---
masakhane/byt5_en_ibo_news
bfc386981c56e49f87adedde4965b7dfdc17e4c4
2022-04-22T11:48:36.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/byt5_en_ibo_news
0
null
transformers
37,054
--- license: afl-3.0 ---
masakhane/m2m100_418M_ibo_en_rel_news
a141b9bb810908ac0f137d91832cddb41ad9ba8c
2022-04-22T12:45:10.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_ibo_en_rel_news
0
null
transformers
37,055
--- license: afl-3.0 ---
masakhane/m2m100_418M_en_ibo_rel_news
cae876ed98491711623fffb1377d0b085b0a03c6
2022-04-22T12:45:12.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_en_ibo_rel_news
0
null
transformers
37,056
--- license: afl-3.0 ---
masakhane/m2m100_418M_ibo_en_rel_news_ft
71641cb8667ec9a52354b62e0360db546e8deadd
2022-04-22T13:49:20.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_ibo_en_rel_news_ft
0
null
transformers
37,057
--- license: afl-3.0 ---
masakhane/m2m100_418M_en_ibo_rel_ft
5b9518aa11d25fb06de75b1a590e28a267f62ba5
2022-04-22T13:49:27.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_en_ibo_rel_ft
0
null
transformers
37,058
--- license: afl-3.0 ---
masakhane/m2m100_418M_en_ibo_rel
e2b47f707c6f48d7d4ad9cb0dd2afa9982b37417
2022-04-22T14:45:22.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_en_ibo_rel
0
null
transformers
37,059
--- license: afl-3.0 ---
masakhane/m2m100_418M_ibo_en_rel
8a19c96129df88a7133147f0640b3fc9b264d4ac
2022-04-22T14:45:24.000Z
[ "pytorch", "m2m_100", "text2text-generation", "transformers", "license:afl-3.0", "autotrain_compatible" ]
text2text-generation
false
masakhane
null
masakhane/m2m100_418M_ibo_en_rel
0
null
transformers
37,060
--- license: afl-3.0 ---
negfir/bert_uncased_L-8_H-256_A-4wiki103
a301777c18ca6d82ab9d2880ebf19f733e17eff8
2022-04-21T21:45:44.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
negfir
null
negfir/bert_uncased_L-8_H-256_A-4wiki103
0
null
transformers
37,061
Entry not found
samake/distilbert-base-uncased-finetuned-ner
bbd9576993c98cd3a29ce9d8db331dd75c6dba64
2022-04-22T06:57:56.000Z
[ "pytorch", "tensorboard", "distilbert", "token-classification", "transformers", "autotrain_compatible" ]
token-classification
false
samake
null
samake/distilbert-base-uncased-finetuned-ner
0
null
transformers
37,062
Entry not found
rajat99/Fine_Tuning_XLSR_300M_on_OpenSLR_model
cdf281c8f1bc69c8a5ccfb68456b4c3933096acc
2022-04-22T13:11:07.000Z
[ "pytorch", "tensorboard", "wav2vec2", "automatic-speech-recognition", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
rajat99
null
rajat99/Fine_Tuning_XLSR_300M_on_OpenSLR_model
0
null
transformers
37,063
--- license: apache-2.0 tags: - generated_from_trainer model-index: - name: Fine_Tuning_XLSR_300M_on_OpenSLR_model results: [] --- <!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Fine_Tuning_XLSR_300M_on_OpenSLR_model This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on the None dataset. It achieves the following results on the evaluation set: - Loss: 3.2669 - Wer: 1.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 2 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - num_epochs: 30 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:-----:|:----:|:---------------:|:---:| | 5.5102 | 23.53 | 400 | 3.2669 | 1.0 | ### Framework versions - Transformers 4.11.3 - Pytorch 1.10.0+cu111 - Datasets 1.18.3 - Tokenizers 0.10.3
huggingtweets/plsnobullywaaa
85439266a7d1de07ac474c33587c753950fc207a
2022-04-22T20:47:21.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/plsnobullywaaa
0
null
transformers
37,064
--- language: en thumbnail: http://www.huggingtweets.com/plsnobullywaaa/1650660437516/predictions.png tags: - huggingtweets widget: - text: "My dream is" --- <div class="inline-flex flex-col" style="line-height: 1.5;"> <div class="flex"> <div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1511292594214551557/4T_znkpc_400x400.jpg&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> <div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)"> </div> </div> <div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div> <div style="text-align: center; font-size: 16px; font-weight: 800">clementine</div> <div style="text-align: center; font-size: 14px;">@plsnobullywaaa</div> </div> I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets). Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)! ## How does it work? The model uses the following pipeline. ![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true) To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI). ## Training data The model was trained on tweets from clementine. | Data | clementine | | --- | --- | | Tweets downloaded | 774 | | Retweets | 32 | | Short tweets | 258 | | Tweets kept | 484 | [Explore the data](https://wandb.ai/wandb/huggingtweets/runs/125ldexx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline. ## Training procedure The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @plsnobullywaaa's tweets. Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2whc68l3) for full transparency and reproducibility. At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2whc68l3/artifacts) is logged and versioned. ## How to use You can use this model directly with a pipeline for text generation: ```python from transformers import pipeline generator = pipeline('text-generation', model='huggingtweets/plsnobullywaaa') generator("My dream is", num_return_sequences=5) ``` ## Limitations and bias The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias). In addition, the data present in the user's tweets further affects the text generated by the model. ## About *Built by Boris Dayma* [![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma) For more details, visit the project repository. [![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/proanatwink
011ac7a0809f0a1c7e99cbbe2148bb81a6bbbd94
2022-04-22T17:26:21.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/proanatwink
0
null
transformers
37,065
---
language: en
thumbnail: http://www.huggingtweets.com/proanatwink/1650648376939/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1509040026625224705/B_S4MCbD_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">God is Love (((they)))/them🪲✊🏼🇺🇦🇮🇱🏳️‍⚧️</div>
<div style="text-align: center; font-size: 14px;">@proanatwink</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from God is Love (((they)))/them🪲✊🏼🇺🇦🇮🇱🏳️‍⚧️.

| Data | God is Love (((they)))/them🪲✊🏼🇺🇦🇮🇱🏳️‍⚧️ |
| --- | --- |
| Tweets downloaded | 613 |
| Retweets | 120 |
| Short tweets | 142 |
| Tweets kept | 351 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/yp8eka3q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @proanatwink's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3lu2xkr5) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3lu2xkr5/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/proanatwink')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/charlottefang77
198609d0f7e2510df6a06a51bdae5a2632999324
2022-04-23T17:49:26.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/charlottefang77
0
null
transformers
37,066
---
language: en
thumbnail: http://www.huggingtweets.com/charlottefang77/1650736161071/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1509915576566620162/LShNQbfF_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">♡ Charlotte Fang 刹利</div>
<div style="text-align: center; font-size: 14px;">@charlottefang77</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from ♡ Charlotte Fang 刹利.

| Data | ♡ Charlotte Fang 刹利 |
| --- | --- |
| Tweets downloaded | 3190 |
| Retweets | 1655 |
| Short tweets | 381 |
| Tweets kept | 1154 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2lq9iqf9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @charlottefang77's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/39i3lnlw) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/39i3lnlw/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/charlottefang77')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/miyarepostbot
27bc5a8575e63fff98e14ec9f90314ee242e5e98
2022-04-22T18:13:23.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/miyarepostbot
0
null
transformers
37,067
---
language: en
thumbnail: http://www.huggingtweets.com/miyarepostbot/1650651175106/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1400304659688878088/Lbb8zMZE_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Miya</div>
<div style="text-align: center; font-size: 14px;">@miyarepostbot</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Miya.

| Data | Miya |
| --- | --- |
| Tweets downloaded | 1840 |
| Retweets | 23 |
| Short tweets | 214 |
| Tweets kept | 1603 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2lftgxb7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @miyarepostbot's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1b87ps3a) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1b87ps3a/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/miyarepostbot')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/mimpathy
94bff4994fdbde1d4e85d463df7f2aea3f2d4a04
2022-04-22T18:39:10.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/mimpathy
0
null
transformers
37,068
---
language: en
thumbnail: http://www.huggingtweets.com/mimpathy/1650652745938/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1269411300624363520/-xYW6d_6_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">𝓗𝓸𝓷𝓸𝓻</div>
<div style="text-align: center; font-size: 14px;">@mimpathy</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from 𝓗𝓸𝓷𝓸𝓻.

| Data | 𝓗𝓸𝓷𝓸𝓻 |
| --- | --- |
| Tweets downloaded | 2299 |
| Retweets | 211 |
| Short tweets | 331 |
| Tweets kept | 1757 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/17w4ucd3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @mimpathy's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1qr7mqkc) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1qr7mqkc/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/mimpathy')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/unbridled_id
752044ed1f24421489853fc55ae2ab574ce14596
2022-04-29T20:24:49.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/unbridled_id
0
null
transformers
37,069
---
language: en
thumbnail: http://www.huggingtweets.com/unbridled_id/1651263884816/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1376263696389914629/_FzhUcTW_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sierra Armor 𝔼𝕣𝕚𝕤</div>
<div style="text-align: center; font-size: 14px;">@unbridled_id</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Sierra Armor 𝔼𝕣𝕚𝕤.

| Data | Sierra Armor 𝔼𝕣𝕚𝕤 |
| --- | --- |
| Tweets downloaded | 3146 |
| Retweets | 551 |
| Short tweets | 413 |
| Tweets kept | 2182 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1bhxlbvg/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @unbridled_id's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/n3ccyzg2) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/n3ccyzg2/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/unbridled_id')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/propertyexile
e47b57bf68df6f7cd179b84ea0577f6bf4f63e66
2022-05-09T05:28:39.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/propertyexile
0
null
transformers
37,070
---
language: en
thumbnail: http://www.huggingtweets.com/propertyexile/1652074114021/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1523442545153519616/mYJEJtEL_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Primo</div>
<div style="text-align: center; font-size: 14px;">@propertyexile</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Primo.

| Data | Primo |
| --- | --- |
| Tweets downloaded | 304 |
| Retweets | 37 |
| Short tweets | 26 |
| Tweets kept | 241 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1q8zni52/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @propertyexile's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1f85w6fy) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1f85w6fy/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/propertyexile')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
negfir/bert_uncased_L-6_H-512_A-8wiki103
4d05b6492b51c8cf31750468fdfc399481a8c88b
2022-04-22T20:28:38.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
negfir
null
negfir/bert_uncased_L-6_H-512_A-8wiki103
0
null
transformers
37,071
Entry not found
shahriarg/pretrained_kyw_e1
af08d81214035c4c7a628683a9e884070a6f2f7a
2022-04-22T20:51:32.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
shahriarg
null
shahriarg/pretrained_kyw_e1
0
null
transformers
37,072
Entry not found
huggingtweets/newscollected
54b2074ec6d530f6c186f494af94b6cc47cd6091
2022-05-14T14:14:25.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/newscollected
0
null
transformers
37,073
---
language: en
thumbnail: http://www.huggingtweets.com/newscollected/1652537660752/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1522032150358511616/83U7w6rG_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">del co</div>
<div style="text-align: center; font-size: 14px;">@newscollected</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from del co.

| Data | del co |
| --- | --- |
| Tweets downloaded | 370 |
| Retweets | 30 |
| Short tweets | 68 |
| Tweets kept | 272 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2sfc2k02/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @newscollected's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/1zsagze5) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/1zsagze5/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/newscollected')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/angelicism010-propertyexile-wretched_worm
ab7040c642e4a10f39846a1caacff3c76c3c60a3
2022-04-23T01:52:59.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/angelicism010-propertyexile-wretched_worm
0
null
transformers
37,074
---
language: en
thumbnail: https://github.com/borisdayma/huggingtweets/blob/master/img/logo.png?raw=true
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1517583783020666881/mmUj6mkI_400x400.jpg&#39;)">
</div>
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1383763210314997773/aIIDR23G_400x400.jpg&#39;)">
</div>
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1517290992361422848/E5jRRDlu_400x400.jpg&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Primo &amp; offlineism010 &amp; wretched worm</div>
<div style="text-align: center; font-size: 14px;">@angelicism010-propertyexile-wretched_worm</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Primo & offlineism010 & wretched worm.

| Data | Primo | offlineism010 | wretched worm |
| --- | --- | --- | --- |
| Tweets downloaded | 200 | 278 | 3234 |
| Retweets | 32 | 4 | 320 |
| Short tweets | 17 | 28 | 549 |
| Tweets kept | 151 | 246 | 2365 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/3o7b93qp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @angelicism010-propertyexile-wretched_worm's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/30uxuf66) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/30uxuf66/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/angelicism010-propertyexile-wretched_worm')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/h0uldin
88ea59201dc0719e6ed2d6c54af377569f0c6b1e
2022-06-07T17:23:20.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/h0uldin
0
null
transformers
37,075
---
language: en
thumbnail: http://www.huggingtweets.com/h0uldin/1654622595098/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1532159785692549122/Vt4uxT07_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">H</div>
<div style="text-align: center; font-size: 14px;">@h0uldin</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from H.

| Data | H |
| --- | --- |
| Tweets downloaded | 723 |
| Retweets | 166 |
| Short tweets | 116 |
| Tweets kept | 441 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/22nta9wb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @h0uldin's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2jd5cs4g) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2jd5cs4g/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/h0uldin')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/angelicism010
a54df49013c7b147df51b78599d2217cf812e2f5
2022-04-23T23:32:13.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/angelicism010
0
null
transformers
37,076
---
language: en
thumbnail: http://www.huggingtweets.com/angelicism010/1650756728850/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1383763210314997773/aIIDR23G_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">offlineism010</div>
<div style="text-align: center; font-size: 14px;">@angelicism010</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from offlineism010.

| Data | offlineism010 |
| --- | --- |
| Tweets downloaded | 278 |
| Retweets | 4 |
| Short tweets | 28 |
| Tweets kept | 246 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/2luo02mm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @angelicism010's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3v3jaemf) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3v3jaemf/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/angelicism010')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
negfir/bert_uncased_L-12_H-768_A-12wiki103
1e877188e1b61b94f09c68f8634a817725bbec75
2022-04-23T03:02:15.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
negfir
null
negfir/bert_uncased_L-12_H-768_A-12wiki103
0
null
transformers
37,077
Entry not found
negfir/bert_uncased_L-12_H-512_A-8wiki103
b565745f15a8033e21ab9299e6ebf56c005a949b
2022-04-23T06:53:21.000Z
[ "pytorch", "bert", "fill-mask", "transformers", "autotrain_compatible" ]
fill-mask
false
negfir
null
negfir/bert_uncased_L-12_H-512_A-8wiki103
0
null
transformers
37,078
Entry not found
ywan/unite-up
970bfcf85d82315f6f3522a1fedf76b44e0252ad
2022-04-24T03:35:43.000Z
[ "pytorch", "xlm-roberta", "fill-mask", "transformers", "metric", "quality estimation", "translation evaluation", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
ywan
null
ywan/unite-up
0
null
transformers
37,079
---
license: apache-2.0
tags:
- metric
- quality estimation
- translation evaluation
---

This model is the English-targeted version of "UniTE: Unified Translation Evaluation".
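The card ships no usage code, so here is a minimal, hedged loading sketch with vanilla `transformers`. Note that UniTE's scoring head lives in the authors' own codebase; `AutoModel` only yields the underlying XLM-R encoder, and the pairing of an English hypothesis with the source text follows the UniTE paper's setup rather than anything stated on this card:

```python
from transformers import AutoTokenizer, AutoModel

# Loads only the XLM-R encoder weights; the UniTE regression head
# is not part of the stock transformers model classes.
tokenizer = AutoTokenizer.from_pretrained("ywan/unite-up")
model = AutoModel.from_pretrained("ywan/unite-up")

# UniTE evaluates a hypothesis jointly with the source and/or reference;
# here we simply encode one English-hypothesis/French-source pair.
inputs = tokenizer("The cat sat on the mat.",
                   "Le chat s'est assis sur le tapis.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```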
ywan/unite-mup
540eddef1fbf341a861c9afccb59ece5ade118b7
2022-04-24T04:06:41.000Z
[ "pytorch", "xlm-roberta", "fill-mask", "transformers", "metric", "quality estimation", "translation evaluation", "license:apache-2.0", "autotrain_compatible" ]
fill-mask
false
ywan
null
ywan/unite-mup
0
null
transformers
37,080
---
license: apache-2.0
tags:
- metric
- quality estimation
- translation evaluation
---

This model is the multilingual version of "UniTE: Unified Translation Evaluation".
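The same hedged loading pattern as the English-targeted checkpoint above applies; being multilingual, hypothesis and reference/source may be in any language XLM-R covers (again, only the encoder loads through vanilla `transformers`):

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("ywan/unite-mup")
model = AutoModel.from_pretrained("ywan/unite-mup")

# Encode a German hypothesis against a French source as a sanity check.
inputs = tokenizer("Der Hund bellt.", "Le chien aboie.", return_tensors="pt")
print(model(**inputs).last_hidden_state.shape)
```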
huggingtweets/newscollected-nickmullensgf
66f4c7e076a68a86a37e62851942f11c05422ac4
2022-05-12T13:41:10.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/newscollected-nickmullensgf
0
null
transformers
37,081
---
language: en
thumbnail: http://www.huggingtweets.com/newscollected-nickmullensgf/1652362865457/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1522032150358511616/83U7w6rG_400x400.jpg&#39;)">
</div>
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1469950344918671364/-037cCwh_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI CYBORG 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">del co &amp; kayla</div>
<div style="text-align: center; font-size: 14px;">@newscollected-nickmullensgf</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from del co & kayla.

| Data | del co | kayla |
| --- | --- | --- |
| Tweets downloaded | 366 | 3215 |
| Retweets | 30 | 946 |
| Short tweets | 67 | 362 |
| Tweets kept | 269 | 1907 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/nqg16qms/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @newscollected-nickmullensgf's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3jf63jpr) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3jf63jpr/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/newscollected-nickmullensgf')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/dnlklr
b1441d7c8fe10cdf41bb5a1ed580ee0be9c9f432
2022-04-23T18:02:48.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/dnlklr
0
null
transformers
37,082
---
language: en
thumbnail: http://www.huggingtweets.com/dnlklr/1650736963681/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1485855322895880192/6tnb9u8H_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Daniel Keller</div>
<div style="text-align: center; font-size: 14px;">@dnlklr</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Daniel Keller.

| Data | Daniel Keller |
| --- | --- |
| Tweets downloaded | 3229 |
| Retweets | 85 |
| Short tweets | 555 |
| Tweets kept | 2589 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/gzfhywi9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @dnlklr's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/2pz5v2py) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/2pz5v2py/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/dnlklr')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/c8ohe2cqqe092cq
32a2b1bebba00e2f2fc4bea85027dbf1dc1a368b
2022-05-24T21:29:49.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/c8ohe2cqqe092cq
0
null
transformers
37,083
---
language: en
thumbnail: http://www.huggingtweets.com/c8ohe2cqqe092cq/1653427783549/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1519998754425872385/VoEOP0Xg_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">A kind Face</div>
<div style="text-align: center; font-size: 14px;">@c8ohe2cqqe092cq</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from A kind Face.

| Data | A kind Face |
| --- | --- |
| Tweets downloaded | 3242 |
| Retweets | 189 |
| Short tweets | 1151 |
| Tweets kept | 1902 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1czak1k4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @c8ohe2cqqe092cq's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/35wrrmdp) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/35wrrmdp/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/c8ohe2cqqe092cq')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
smeoni/nbme-deberta-v2-xlarge
f3e4943743c58dfb9d41088b328ea959343ea591
2022-04-24T17:56:24.000Z
[ "pytorch", "tensorboard", "deberta-v2", "fill-mask", "transformers", "generated_from_trainer", "license:mit", "model-index", "autotrain_compatible" ]
fill-mask
false
smeoni
null
smeoni/nbme-deberta-v2-xlarge
0
null
transformers
37,084
---
license: mit
tags:
- generated_from_trainer
model-index:
- name: nbme-deberta-v2-xlarge
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nbme-deberta-v2-xlarge

This model is a fine-tuned version of [microsoft/deberta-v2-xlarge](https://huggingface.co/microsoft/deberta-v2-xlarge) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 6.5986

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 4
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 6.5771 | 1.0 | 1847 | 6.6380 |
| 6.4068 | 2.0 | 3694 | 6.6034 |
| 6.3597 | 3.0 | 5541 | 6.5986 |

### Framework versions

- Transformers 4.19.0.dev0
- Pytorch 1.11.0
- Datasets 2.1.0
- Tokenizers 0.12.1
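The hyperparameters above map directly onto `transformers.TrainingArguments`; a hypothetical reconstruction is sketched below. The actual training script and dataset are not published with this card, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="nbme-deberta-v2-xlarge",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,  # 4 * 8 = total train batch size 32
    lr_scheduler_type="linear",
    num_train_epochs=3.0,
    fp16=True,                      # mixed_precision_training: Native AMP
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default,
# so no explicit optimizer configuration is needed.
```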
jackh1995/base
fbf461152ffe35223eeb10ff3ae7bef7ebe6cf04
2022-04-24T07:51:13.000Z
[ "pytorch", "bert", "question-answering", "transformers", "autotrain_compatible" ]
question-answering
false
jackh1995
null
jackh1995/base
0
null
transformers
37,085
Entry not found
jackh1995/albert-base
6da75bb41440f982a9324cc4c0529d5d9769a709
2022-04-24T07:57:41.000Z
[ "pytorch", "bert", "question-answering", "transformers", "autotrain_compatible" ]
question-answering
false
jackh1995
null
jackh1995/albert-base
0
null
transformers
37,086
Entry not found
smeoni/nbme-electra-large-generator
1de42bb009358a9c3485e88e04ac4e6fe078e027
2022-04-24T11:08:43.000Z
[ "pytorch", "tensorboard", "electra", "text-generation", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
text-generation
false
smeoni
null
smeoni/nbme-electra-large-generator
0
null
transformers
37,087
---
license: apache-2.0
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: nbme-electra-large-generator
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# nbme-electra-large-generator

This model is a fine-tuned version of [google/electra-large-generator](https://huggingface.co/google/electra-large-generator) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0122
- Accuracy: 0.9977

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 195 | 0.1125 | 0.9789 |
| No log | 2.0 | 390 | 0.0141 | 0.9973 |
| 0.6233 | 3.0 | 585 | 0.0122 | 0.9977 |

### Framework versions

- Transformers 4.19.0.dev0
- Pytorch 1.11.0
- Datasets 2.1.0
- Tokenizers 0.12.1
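Although the record's pipeline tag reads text-generation, an ELECTRA generator is a masked language model trained with an MLM accuracy metric, so it should be queryable through the fill-mask pipeline. A minimal, untested sketch (the clinical example sentence is illustrative only, and the assumption that the checkpoint loads as a masked LM is ours, not the card's):

```python
from transformers import pipeline

# Assumption: the generator head loads as an ELECTRA masked LM.
fill_mask = pipeline("fill-mask", model="smeoni/nbme-electra-large-generator")
for pred in fill_mask("The patient reports chest [MASK] and shortness of breath."):
    print(pred["token_str"], round(pred["score"], 3))
```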
scasutt/wav2vec2-base_toy_train_double_data
d8b94d96a67103d942d0077ef5762dfe0478f889
2022-04-24T15:55:54.000Z
[ "pytorch", "wav2vec2", "automatic-speech-recognition", "transformers" ]
automatic-speech-recognition
false
scasutt
null
scasutt/wav2vec2-base_toy_train_double_data
0
null
transformers
37,088
Entry not found
macavaney/monot5-base-msmarco-sim5
677bc221b85059e54b7f9b443ab63e182c825d4f
2022-04-24T15:29:15.000Z
[ "pytorch", "t5", "text2text-generation", "transformers", "autotrain_compatible" ]
text2text-generation
false
macavaney
null
macavaney/monot5-base-msmarco-sim5
0
null
transformers
37,089
Entry not found
huggingtweets/plasma_node
fab7b1e5eda3cc023d815181a57d5becf9525cbd
2022-06-09T09:49:38.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/plasma_node
0
null
transformers
37,090
---
language: en
thumbnail: http://www.huggingtweets.com/plasma_node/1654768173539/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1448820786395975694/619AxWvJ_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Plasmanode</div>
<div style="text-align: center; font-size: 14px;">@plasma_node</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Plasmanode.

| Data | Plasmanode |
| --- | --- |
| Tweets downloaded | 3242 |
| Retweets | 573 |
| Short tweets | 339 |
| Tweets kept | 2330 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/21cfw258/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @plasma_node's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/s5kag6o2) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/s5kag6o2/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline
generator = pipeline('text-generation', model='huggingtweets/plasma_node')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
ridhamrudhar/wav2vec2-common_voice-pa-In-demo
956e9c9892d3720fee8e83954c05d329e57665a7
2022-04-26T08:16:23.000Z
[ "pytorch", "wav2vec2", "automatic-speech-recognition", "transformers" ]
automatic-speech-recognition
false
ridhamrudhar
null
ridhamrudhar/wav2vec2-common_voice-pa-In-demo
0
null
transformers
37,091
Entry not found
dbmdz/flair-hipe-2022-ajmc-en
5aa03394b9fb7a325b74e6197b1e1a9e906e8a2b
2022-04-28T14:31:08.000Z
[ "pytorch", "license:mit" ]
null
false
dbmdz
null
dbmdz/flair-hipe-2022-ajmc-en
0
null
null
37,092
---
license: mit
---
dbmdz/flair-hipe-2022-ajmc-fr
540b6d70a0e48fe14e7b0ae9c492d5549fad65c6
2022-04-28T14:33:13.000Z
[ "pytorch", "license:mit" ]
null
false
dbmdz
null
dbmdz/flair-hipe-2022-ajmc-fr
0
null
null
37,093
---
license: mit
---
huggingtweets/jstoone
f7061ded65c46bd0b2b11309007a30625abaedec
2022-04-25T13:31:37.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/jstoone
0
null
transformers
37,094
---
language: en
thumbnail: http://www.huggingtweets.com/jstoone/1650893492572/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1233003191538790400/3OxNooXT_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Jakob Steinn</div>
<div style="text-align: center; font-size: 14px;">@jstoone</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from Jakob Steinn.

| Data | Jakob Steinn |
| --- | --- |
| Tweets downloaded | 3204 |
| Retweets | 713 |
| Short tweets | 177 |
| Tweets kept | 2314 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1j98493p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @jstoone's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3vtqate8) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3vtqate8/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/jstoone')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
sanchit-gandhi/xtreme_s_xlsr_2_bart_covost2_fr_en
0b6bcac3d8047ccae6ba3746584c5cb96fc5a3c4
2022-05-06T12:38:45.000Z
[ "pytorch", "tensorboard", "speech-encoder-decoder", "automatic-speech-recognition", "dataset:xtreme_s", "transformers", "generated_from_trainer", "model-index" ]
automatic-speech-recognition
false
sanchit-gandhi
null
sanchit-gandhi/xtreme_s_xlsr_2_bart_covost2_fr_en
0
null
transformers
37,095
---
tags:
- generated_from_trainer
datasets:
- xtreme_s
metrics:
- bleu
model-index:
- name: ''
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# 

This model was trained from scratch on the xtreme_s dataset.
It achieves the following results on the evaluation set:
- Loss: 2.1356
- Bleu: 0.0000

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 3.0
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Bleu   |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 2.6458        | 0.31  | 500  | 4.3754          | 0.0    |
| 2.3505        | 0.62  | 1000 | 4.3071          | 0.0    |
| 2.2152        | 0.93  | 1500 | 3.9444          | 0.0    |
| 2.79          | 1.23  | 2000 | 3.2046          | 0.0000 |
| 2.569         | 1.54  | 2500 | 2.6812          | 0.0000 |
| 2.322         | 1.85  | 3000 | 2.4081          | 0.0000 |
| 2.3435        | 2.16  | 3500 | 2.2696          | 0.0000 |
| 2.2063        | 2.47  | 4000 | 2.2452          | 0.0000 |
| 2.1087        | 2.78  | 4500 | 2.1356          | 0.0000 |

### Framework versions

- Transformers 4.19.0.dev0
- Pytorch 1.10.2+cu113
- Datasets 2.1.1.dev0
- Tokenizers 0.11.0
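The hyperparameters listed above map roughly onto `Seq2SeqTrainingArguments`. The sketch below shows that mapping only; the output directory is an assumed placeholder and the logging/saving settings of the actual run are unknown:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the reported hyperparameters; "./xtreme_s_xlsr_2_bart" is an assumed path.
training_args = Seq2SeqTrainingArguments(
    output_dir="./xtreme_s_xlsr_2_bart",
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=16,  # 8 * 16 = 128 effective train batch size
    warmup_steps=500,
    num_train_epochs=3.0,
    fp16=True,                       # mixed precision (native AMP)
    seed=42,
)
```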
xiaoGato/DialoGPT-small-villanelle
4a683fc14c8af4bb385273a5c5d1d5b33020b754
2022-04-25T17:29:43.000Z
[ "pytorch", "gpt2", "text-generation", "transformers", "conversational" ]
conversational
false
xiaoGato
null
xiaoGato/DialoGPT-small-villanelle
0
null
transformers
37,096
---
tags:
- conversational
---

# Killing Eve DialoGPT Model
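The card includes no usage code; a minimal chat-loop sketch for a DialoGPT-style model follows. The turn count and `max_length` are illustrative assumptions:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("xiaoGato/DialoGPT-small-villanelle")
model = AutoModelForCausalLM.from_pretrained("xiaoGato/DialoGPT-small-villanelle")

chat_history_ids = None
for step in range(3):  # three illustrative turns
    text = input(">> User: ")
    # Each user turn ends with the EOS token, per the DialoGPT convention.
    new_input_ids = tokenizer.encode(text + tokenizer.eos_token, return_tensors="pt")
    # Append the new turn to the running conversation history.
    bot_input_ids = (
        torch.cat([chat_history_ids, new_input_ids], dim=-1)
        if chat_history_ids is not None
        else new_input_ids
    )
    chat_history_ids = model.generate(
        bot_input_ids, max_length=1000, pad_token_id=tokenizer.eos_token_id
    )
    # Decode only the newly generated tokens (the bot's reply).
    reply = tokenizer.decode(
        chat_history_ids[:, bot_input_ids.shape[-1]:][0], skip_special_tokens=True
    )
    print("Bot:", reply)
```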
jhonparra18/wav2vec2-large-xls-r-300m-guarani-small-wb
c6be00acfd794225602073c322762bbcf1e0a798
2022-04-27T16:40:31.000Z
[ "pytorch", "wav2vec2", "automatic-speech-recognition", "dataset:common_voice", "transformers", "generated_from_trainer", "license:apache-2.0", "model-index" ]
automatic-speech-recognition
false
jhonparra18
null
jhonparra18/wav2vec2-large-xls-r-300m-guarani-small-wb
0
null
transformers
37,097
---
license: apache-2.0
tags:
- generated_from_trainer
datasets:
- common_voice
model-index:
- name: wav2vec2-large-xls-r-300m-guarani-small-wb
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. -->

# wav2vec2-large-xls-r-300m-guarani-small-wb

This model is a fine-tuned version of [glob-asr/wav2vec2-large-xls-r-300m-guarani-small](https://huggingface.co/glob-asr/wav2vec2-large-xls-r-300m-guarani-small) on the common_voice dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1622
- Wer: 0.2446
- Cer: 0.0368

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 20
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 0.1818        | 0.32  | 10   | 0.1196          | 0.2146 | 0.0305 |
| 0.2953        | 0.65  | 20   | 0.1801          | 0.3090 | 0.0426 |
| 0.2941        | 0.97  | 30   | 0.1935          | 0.3090 | 0.0420 |
| 0.2786        | 1.29  | 40   | 0.1899          | 0.3305 | 0.0483 |
| 0.2665        | 1.61  | 50   | 0.1716          | 0.3176 | 0.0454 |
| 0.2752        | 1.94  | 60   | 0.1895          | 0.3948 | 0.0564 |
| 0.2482        | 2.26  | 70   | 0.1753          | 0.3176 | 0.0449 |
| 0.2486        | 2.58  | 80   | 0.1501          | 0.2747 | 0.0403 |
| 0.2878        | 2.9   | 90   | 0.1890          | 0.3348 | 0.0529 |
| 0.2539        | 3.23  | 100  | 0.2076          | 0.4635 | 0.0610 |
| 0.2069        | 3.55  | 110  | 0.1711          | 0.3476 | 0.0466 |
| 0.2262        | 3.87  | 120  | 0.1839          | 0.3605 | 0.0500 |
| 0.2032        | 4.19  | 130  | 0.1724          | 0.3391 | 0.0489 |
| 0.1997        | 4.52  | 140  | 0.1498          | 0.2704 | 0.0414 |
| 0.2216        | 4.84  | 150  | 0.1531          | 0.3047 | 0.0472 |
| 0.2294        | 5.16  | 160  | 0.1882          | 0.3176 | 0.0500 |
| 0.2305        | 5.48  | 170  | 0.1799          | 0.3176 | 0.0483 |
| 0.2052        | 5.81  | 180  | 0.1645          | 0.3262 | 0.0477 |
| 0.2192        | 6.13  | 190  | 0.1439          | 0.2060 | 0.0339 |
| 0.1844        | 6.45  | 200  | 0.1557          | 0.2918 | 0.0403 |
| 0.1803        | 6.77  | 210  | 0.1664          | 0.3004 | 0.0426 |
| 0.1831        | 7.1   | 220  | 0.1780          | 0.3176 | 0.0477 |
| 0.1618        | 7.42  | 230  | 0.1671          | 0.2661 | 0.0437 |
| 0.1528        | 7.74  | 240  | 0.2108          | 0.3176 | 0.0506 |
| 0.1335        | 8.06  | 250  | 0.1677          | 0.2575 | 0.0408 |
| 0.1736        | 8.39  | 260  | 0.1581          | 0.3004 | 0.0460 |
| 0.1607        | 8.71  | 270  | 0.1529          | 0.3047 | 0.0403 |
| 0.1451        | 9.03  | 280  | 0.1666          | 0.2747 | 0.0408 |
| 0.1534        | 9.35  | 290  | 0.1722          | 0.2833 | 0.0437 |
| 0.1567        | 9.68  | 300  | 0.1747          | 0.2918 | 0.0397 |
| 0.1356        | 10.0  | 310  | 0.1659          | 0.2961 | 0.0443 |
| 0.1248        | 10.32 | 320  | 0.1752          | 0.3348 | 0.0449 |
| 0.149         | 10.65 | 330  | 0.1792          | 0.3348 | 0.0449 |
| 0.1471        | 10.97 | 340  | 0.1843          | 0.3391 | 0.0460 |
| 0.1564        | 11.29 | 350  | 0.2015          | 0.3433 | 0.0460 |
| 0.1597        | 11.61 | 360  | 0.1798          | 0.2618 | 0.0380 |
| 0.161         | 11.94 | 370  | 0.1716          | 0.2747 | 0.0374 |
| 0.1481        | 12.26 | 380  | 0.1776          | 0.2747 | 0.0397 |
| 0.1168        | 12.58 | 390  | 0.1900          | 0.2961 | 0.0454 |
| 0.1173        | 12.9  | 400  | 0.1987          | 0.3090 | 0.0454 |
| 0.1245        | 13.23 | 410  | 0.1710          | 0.2918 | 0.0408 |
| 0.1118        | 13.55 | 420  | 0.1808          | 0.3047 | 0.0431 |
| 0.1111        | 13.87 | 430  | 0.1893          | 0.2747 | 0.0403 |
| 0.1041        | 14.19 | 440  | 0.1876          | 0.2918 | 0.0431 |
| 0.1152        | 14.52 | 450  | 0.1800          | 0.2790 | 0.0408 |
| 0.107         | 14.84 | 460  | 0.1717          | 0.2747 | 0.0385 |
| 0.1139        | 15.16 | 470  | 0.1652          | 0.2704 | 0.0391 |
| 0.0922        | 15.48 | 480  | 0.1659          | 0.2618 | 0.0391 |
| 0.101         | 15.81 | 490  | 0.1610          | 0.2489 | 0.0362 |
| 0.0835        | 16.13 | 500  | 0.1584          | 0.2403 | 0.0362 |
| 0.1251        | 16.45 | 510  | 0.1601          | 0.2575 | 0.0380 |
| 0.0888        | 16.77 | 520  | 0.1632          | 0.2661 | 0.0380 |
| 0.0968        | 17.1  | 530  | 0.1674          | 0.2661 | 0.0385 |
| 0.1105        | 17.42 | 540  | 0.1629          | 0.2833 | 0.0391 |
| 0.0914        | 17.74 | 550  | 0.1623          | 0.3090 | 0.0408 |
| 0.0843        | 18.06 | 560  | 0.1611          | 0.3004 | 0.0408 |
| 0.0861        | 18.39 | 570  | 0.1583          | 0.2661 | 0.0385 |
| 0.0861        | 18.71 | 580  | 0.1579          | 0.2618 | 0.0385 |
| 0.0678        | 19.03 | 590  | 0.1585          | 0.2661 | 0.0374 |
| 0.0934        | 19.35 | 600  | 0.1613          | 0.2489 | 0.0368 |
| 0.0976        | 19.68 | 610  | 0.1617          | 0.2446 | 0.0368 |
| 0.0799        | 20.0  | 620  | 0.1622          | 0.2446 | 0.0368 |

### Framework versions

- Transformers 4.18.0
- Pytorch 1.11.0+cu113
- Datasets 2.1.0
- Tokenizers 0.12.1
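The card omits a usage section; a minimal transcription sketch with the ASR pipeline follows. The audio file name is a placeholder, and decoding a local file this way assumes ffmpeg is available:

```python
from transformers import pipeline

# Load the fine-tuned Guarani checkpoint; "sample.wav" is a placeholder audio file.
asr = pipeline(
    "automatic-speech-recognition",
    model="jhonparra18/wav2vec2-large-xls-r-300m-guarani-small-wb",
)
print(asr("sample.wav")["text"])
```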
huggingtweets/gerardoalone
e7d20371deebb8e0173bd9bf7b07ae30e39ddd79
2022-04-26T03:31:54.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/gerardoalone
0
null
transformers
37,098
---
language: en
thumbnail: http://www.huggingtweets.com/gerardoalone/1650943909493/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1513716426795855876/jWAK0lo4_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">gay wedding technology</div>
<div style="text-align: center; font-size: 14px;">@gerardoalone</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from gay wedding technology.

| Data | gay wedding technology |
| --- | --- |
| Tweets downloaded | 3239 |
| Retweets | 406 |
| Short tweets | 737 |
| Tweets kept | 2096 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/1p260sem/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @gerardoalone's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/3p1683gy) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/3p1683gy/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/gerardoalone')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)
huggingtweets/femboi_canis
4d528adb868e648a3360a8014aa313e0c0913fd9
2022-04-26T00:26:30.000Z
[ "pytorch", "gpt2", "text-generation", "en", "transformers", "huggingtweets" ]
text-generation
false
huggingtweets
null
huggingtweets/femboi_canis
0
null
transformers
37,099
---
language: en
thumbnail: http://www.huggingtweets.com/femboi_canis/1650932783971/predictions.png
tags:
- huggingtweets
widget:
- text: "My dream is"
---

<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div style="display:inherit; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;https://pbs.twimg.com/profile_images/1479992104306843648/e2XQNywk_400x400.jpg&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
<div style="display:none; margin-left: 4px; margin-right: 4px; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url(&#39;&#39;)">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 AI BOT 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">🌻 Ole Grim | Femboi | Cane | It/Its | Hy/Hym 🔞</div>
<div style="text-align: center; font-size: 14px;">@femboi_canis</div>
</div>

I was made with [huggingtweets](https://github.com/borisdayma/huggingtweets).

Create your own bot based on your favorite user with [the demo](https://colab.research.google.com/github/borisdayma/huggingtweets/blob/master/huggingtweets-demo.ipynb)!

## How does it work?

The model uses the following pipeline.

![pipeline](https://github.com/borisdayma/huggingtweets/blob/master/img/pipeline.png?raw=true)

To understand how the model was developed, check the [W&B report](https://wandb.ai/wandb/huggingtweets/reports/HuggingTweets-Train-a-Model-to-Generate-Tweets--VmlldzoxMTY5MjI).

## Training data

The model was trained on tweets from 🌻 Ole Grim | Femboi | Cane | It/Its | Hy/Hym 🔞.

| Data | 🌻 Ole Grim \| Femboi \| Cane \| It/Its \| Hy/Hym 🔞 |
| --- | --- |
| Tweets downloaded | 3207 |
| Retweets | 412 |
| Short tweets | 206 |
| Tweets kept | 2589 |

[Explore the data](https://wandb.ai/wandb/huggingtweets/runs/27g3w5y2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.

## Training procedure

The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on @femboi_canis's tweets.

Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/wandb/huggingtweets/runs/jv8wsew4) for full transparency and reproducibility.

At the end of training, [the final model](https://wandb.ai/wandb/huggingtweets/runs/jv8wsew4/artifacts) is logged and versioned.

## How to use

You can use this model directly with a pipeline for text generation:

```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingtweets/femboi_canis')
generator("My dream is", num_return_sequences=5)
```

## Limitations and bias

The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).

In addition, the data present in the user's tweets further affects the text generated by the model.

## About

*Built by Boris Dayma*

[![Follow](https://img.shields.io/twitter/follow/borisdayma?style=social)](https://twitter.com/intent/follow?screen_name=borisdayma)

For more details, visit the project repository.

[![GitHub stars](https://img.shields.io/github/stars/borisdayma/huggingtweets?style=social)](https://github.com/borisdayma/huggingtweets)