Dataset schema (one row per model card):

| Column | Type |
|---|---|
| sha | null |
| last_modified | null |
| library_name | string (154 classes) |
| text | string (1–900k chars) |
| metadata | string (2–348k chars) |
| pipeline_tag | string (45 classes) |
| id | string (5–122 chars) |
| tags | list (1–1.84k items) |
| created_at | string (25 chars) |
| arxiv | list (0–201 items) |
| languages | list (0–1.83k items) |
| tags_str | string (17–9.34k chars) |
| text_str | string (0–389k chars) |
| text_lists | list (0–722 items) |
| processed_texts | list (1–723 items) |
| tokens_length | list (1–723 items) |
| input_texts | list (1–61 items) |
| embeddings | list (768 items) |
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('https://images.genius.com/ed9a330b2539058076e0c48398599b09.1000x1000x1.jpg')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Joni Mitchell</div>
<a href="https://genius.com/artists/joni-mitchell">
<div style="text-align: center; font-size: 14px;">@joni-mitchell</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Joni Mitchell.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/joni-mitchell) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/joni-mitchell")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1m5n59kk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Joni Mitchell's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/34saoh5x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/34saoh5x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/joni-mitchell')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/joni-mitchell")
model = AutoModelForCausalLM.from_pretrained("huggingartists/joni-mitchell")
```
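For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```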
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/joni-mitchell"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/joni-mitchell
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/joni-mitchell",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/joni-mitchell #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kanye West</div>
<a href="https://genius.com/artists/kanye-west">
<div style="text-align: center; font-size: 14px;">@kanye-west</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kanye West.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kanye-west) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kanye-west")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/hl7afoso/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kanye West's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/28dw8m5v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/28dw8m5v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kanye-west')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kanye-west")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kanye-west")
```
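For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```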
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kanye-west"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kanye-west
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kanye-west",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kanye-west #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Каста (Kasta)</div>
<a href="https://genius.com/artists/kasta">
<div style="text-align: center; font-size: 14px;">@kasta</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Каста (Kasta).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kasta) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kasta")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3k79xvbx/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Каста (Kasta)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1rphmch0) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1rphmch0/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kasta')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kasta")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kasta")
```
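For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```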
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kasta"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kasta
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kasta",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kasta #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kehlani</div>
<a href="https://genius.com/artists/kehlani">
<div style="text-align: center; font-size: 14px;">@kehlani</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kehlani.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kehlani) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kehlani")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3t2b2m5y/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kehlani's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/35pweb11) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/35pweb11/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kehlani')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kehlani")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kehlani")
```
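For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```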
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kehlani"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kehlani
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kehlani",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kehlani #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кипелов (Kipelov)</div>
<a href="https://genius.com/artists/kipelov">
<div style="text-align: center; font-size: 14px;">@kipelov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Кипелов (Kipelov).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kipelov) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kipelov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/225m5y65/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Кипелов (Kipelov)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/38es269x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/38es269x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kipelov')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kipelov")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kipelov")
```
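For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```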
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kipelov"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kipelov
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kipelov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kipelov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Кишлак (Kishlak)</div>
<a href="https://genius.com/artists/kishlak">
<div style="text-align: center; font-size: 14px;">@kishlak</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Кишлак (Kishlak).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kishlak) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kishlak")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2654f8ic/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Кишлак (Kishlak)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/12gu37uv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/12gu37uv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kishlak')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kishlak")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kishlak")
```
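For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```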
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kishlak"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kishlak
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kishlak",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kishlak #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">kizaru</div>
<a href="https://genius.com/artists/kizaru">
<div style="text-align: center; font-size: 14px;">@kizaru</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from kizaru.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kizaru) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kizaru")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2goru0fu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on kizaru's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1zni18k7) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1zni18k7/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kizaru')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kizaru")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kizaru")
```
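For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```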
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kizaru"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kizaru
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kizaru",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kizaru #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Krechet</div>
<a href="https://genius.com/artists/krechet">
<div style="text-align: center; font-size: 14px;">@krechet</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Krechet.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/krechet) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/krechet")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1c2yk38s/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Krechet's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/39bxkroc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/39bxkroc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/krechet')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/krechet")
model = AutoModelForCausalLM.from_pretrained("huggingartists/krechet")
```
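For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```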
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/krechet"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/krechet
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/krechet",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/krechet #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Kurt Cobain</div>
<a href="https://genius.com/artists/kurt-cobain">
<div style="text-align: center; font-size: 14px;">@kurt-cobain</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Kurt Cobain.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/kurt-cobain) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/kurt-cobain")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/tjfuj6tr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Kurt Cobain's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3enopofm) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3enopofm/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/kurt-cobain')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/kurt-cobain")
model = AutoModelForCausalLM.from_pretrained("huggingartists/kurt-cobain")
```
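For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```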
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the lyrics used for fine-tuning further affect the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/kurt-cobain"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/kurt-cobain
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/kurt-cobain",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/kurt-cobain #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
null | null |
transformers
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('…')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lady Gaga</div>
<a href="https://genius.com/artists/lady-gaga">
<div style="text-align: center; font-size: 14px;">@lady-gaga</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lady Gaga.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/lady-gaga) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lady-gaga")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/17c0d4ej/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lady Gaga's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2j7yp9qd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/lady-gaga')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# AutoModelForCausalLM supersedes the deprecated AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lady-gaga")
model = AutoModelForCausalLM.from_pretrained("huggingartists/lady-gaga")
```
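For finer control over decoding, the tokenizer and model loaded above can be called directly. The snippet below is a minimal sketch; the sampling settings are illustrative choices, not the values used to train this model:
```python
# Generate a continuation by sampling (illustrative settings)
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,                    # length of the generated continuation
    do_sample=True,                       # sample instead of greedy decoding
    top_p=0.95,                           # nucleus sampling
    temperature=0.9,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```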
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lady-gaga"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lady-gaga
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lady-gaga",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lady-gaga #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lady Gaga</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lady-gaga</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lady Gaga.
Dataset is available here.
And can be used with:
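```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lady-gaga")
```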
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lady Gaga's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
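```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lady-gaga')
generator("I am", num_return_sequences=5)
```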
Or with Transformers library:
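```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lady-gaga")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lady-gaga")
```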
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lazy Jay</div>
<a href="https://genius.com/artists/lazy-jay">
<div style="text-align: center; font-size: 14px;">@lazy-jay</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lazy Jay.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lazy-jay).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lazy-jay")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/tlb735a4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lazy Jay's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/36z52xfj) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/36z52xfj/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lazy-jay')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lazy-jay")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lazy-jay")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lazy-jay"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lazy-jay
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lazy-jay",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lazy-jay #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lazy Jay</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lazy-jay</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lazy Jay.
Dataset is available here.
And can be used with:
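```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lazy-jay")
```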
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lazy Jay's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
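```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lazy-jay')
generator("I am", num_return_sequences=5)
```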
Or with Transformers library:
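```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lazy-jay")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lazy-jay")
```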
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Led Zeppelin</div>
<a href="https://genius.com/artists/led-zeppelin">
<div style="text-align: center; font-size: 14px;">@led-zeppelin</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Led Zeppelin.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/led-zeppelin).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/led-zeppelin")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/cpexpb1w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Led Zeppelin's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/bna2epba) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/bna2epba/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/led-zeppelin')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/led-zeppelin")
model = AutoModelWithLMHead.from_pretrained("huggingartists/led-zeppelin")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/led-zeppelin"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/led-zeppelin
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/led-zeppelin",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/led-zeppelin #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Led Zeppelin</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@led-zeppelin</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Led Zeppelin.
Dataset is available here.
And can be used with:
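```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/led-zeppelin")
```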
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Led Zeppelin's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
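```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/led-zeppelin')
generator("I am", num_return_sequences=5)
```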
Or with Transformers library:
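```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/led-zeppelin")
model = AutoModelWithLMHead.from_pretrained("huggingartists/led-zeppelin")
```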
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Baby</div>
<a href="https://genius.com/artists/lil-baby">
<div style="text-align: center; font-size: 14px;">@lil-baby</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Baby.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-baby).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-baby")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/vueaothh/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Baby's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/257bod1h) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/257bod1h/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lil-baby')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-baby")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-baby")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-baby"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lil-baby
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-baby",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-baby #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Baby</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lil-baby</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Baby.
Dataset is available here.
And can be used with:
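```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lil-baby")
```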
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Baby's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
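```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lil-baby')
generator("I am", num_return_sequences=5)
```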
Or with Transformers library:
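```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-baby")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-baby")
```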
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Nas X</div>
<a href="https://genius.com/artists/lil-nas-x">
<div style="text-align: center; font-size: 14px;">@lil-nas-x</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Nas X.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-nas-x).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-nas-x")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/n5s2tj7p/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Nas X's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/334lnf7p) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/334lnf7p/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lil-nas-x')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-nas-x")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-nas-x")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-nas-x"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lil-nas-x
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-nas-x",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-nas-x #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Nas X</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lil-nas-x</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Nas X.
Dataset is available here.
And can be used with:
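```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lil-nas-x")
```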
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Nas X's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
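```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lil-nas-x')
generator("I am", num_return_sequences=5)
```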
Or with Transformers library:
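```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-nas-x")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-nas-x")
```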
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Peep</div>
<a href="https://genius.com/artists/lil-peep">
<div style="text-align: center; font-size: 14px;">@lil-peep</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Peep.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-peep).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-peep")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/39q6kspr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Peep's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/g0nxk974) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/g0nxk974/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lil-peep')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-peep")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-peep")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-peep"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lil-peep
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-peep",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-peep #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Peep</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lil-peep</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Peep.
Dataset is available here.
And can be used with:
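```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lil-peep")
```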
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Peep's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
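```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lil-peep')
generator("I am", num_return_sequences=5)
```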
Or with Transformers library:
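```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-peep")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-peep")
```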
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Uzi Vert</div>
<a href="https://genius.com/artists/lil-uzi-vert">
<div style="text-align: center; font-size: 14px;">@lil-uzi-vert</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lil Uzi Vert.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lil-uzi-vert).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lil-uzi-vert")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/14mmkidw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lil Uzi Vert's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3s5iqd7v) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3s5iqd7v/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lil-uzi-vert')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-uzi-vert")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-uzi-vert")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lil-uzi-vert"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lil-uzi-vert
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lil-uzi-vert",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lil-uzi-vert #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lil Uzi Vert</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@lil-uzi-vert</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lil Uzi Vert.
Dataset is available here.
And can be used with:
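```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/lil-uzi-vert")
```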
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lil Uzi Vert's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
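```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lil-uzi-vert')
generator("I am", num_return_sequences=5)
```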
Or with Transformers library:
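```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lil-uzi-vert")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lil-uzi-vert")
```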
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Linkin Park</div>
<a href="https://genius.com/artists/linkin-park">
<div style="text-align: center; font-size: 14px;">@linkin-park</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Linkin Park.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/linkin-park).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/linkin-park")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3mtr0u4z/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Linkin Park's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/fxn4brd6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/fxn4brd6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/linkin-park')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/linkin-park")
model = AutoModelWithLMHead.from_pretrained("huggingartists/linkin-park")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/linkin-park"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/linkin-park
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/linkin-park",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/linkin-park #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Linkin Park</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@linkin-park</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Linkin Park.
Dataset is available here.
And can be used with:
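```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/linkin-park")
```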
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Linkin Park's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
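```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/linkin-park')
generator("I am", num_return_sequences=5)
```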
Or with Transformers library:
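```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/linkin-park")
model = AutoModelWithLMHead.from_pretrained("huggingartists/linkin-park")
```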
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Little Big</div>
<a href="https://genius.com/artists/little-big">
<div style="text-align: center; font-size: 14px;">@little-big</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Little Big.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/little-big).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/little-big")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2rjstm9q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Little Big's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/289c46fn) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/289c46fn/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/little-big')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/little-big")
model = AutoModelWithLMHead.from_pretrained("huggingartists/little-big")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/little-big"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/little-big
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/little-big",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/little-big #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Little Big</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@little-big</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Little Big.
Dataset is available here.
And can be used with:
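```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/little-big")
```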
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Little Big's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
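```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/little-big')
generator("I am", num_return_sequences=5)
```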
Or with Transformers library:
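```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/little-big")
model = AutoModelWithLMHead.from_pretrained("huggingartists/little-big")
```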
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Logic</div>
<a href="https://genius.com/artists/logic">
<div style="text-align: center; font-size: 14px;">@logic</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Logic.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/logic).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/logic")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2rp89nd3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Logic's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/25a9752b) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/25a9752b/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/logic')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/logic")
model = AutoModelWithLMHead.from_pretrained("huggingartists/logic")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/logic"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/logic
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/logic",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/logic #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Logic</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@logic</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Logic.
Dataset is available here.
And can be used with:
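```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/logic")
```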
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Logic's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
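```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/logic')
generator("I am", num_return_sequences=5)
```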
Or with Transformers library:
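```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/logic")
model = AutoModelWithLMHead.from_pretrained("huggingartists/logic")
```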
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Loud Luxury</div>
<a href="https://genius.com/artists/loud-luxury">
<div style="text-align: center; font-size: 14px;">@loud-luxury</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Loud Luxury.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/loud-luxury).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/loud-luxury")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2a6kq74a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Loud Luxury's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2l3op3mf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2l3op3mf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/loud-luxury')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loud-luxury")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loud-luxury")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/loud-luxury"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/loud-luxury
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/loud-luxury",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/loud-luxury #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Loud Luxury</div>
<a href="URL">
<div style="text-align: center; font-size: 14px;">@loud-luxury</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Loud Luxury.
Dataset is available here.
And can be used with:
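```python
from datasets import load_dataset

# Lyrics dataset this model was fine-tuned on
dataset = load_dataset("huggingartists/loud-luxury")
```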
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Loud Luxury's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
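```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/loud-luxury')
generator("I am", num_return_sequences=5)
```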
Or with Transformers library:
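```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/loud-luxury")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loud-luxury")
```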
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LoveRance</div>
<a href="https://genius.com/artists/loverance">
<div style="text-align: center; font-size: 14px;">@loverance</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from LoveRance.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/loverance).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/loverance")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2cr3cjd1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on LoveRance's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/18xbgyqf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/18xbgyqf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/loverance')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/loverance")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loverance")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/loverance"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/loverance
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/loverance",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/loverance #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LoveRance</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@loverance</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from LoveRance.
Dataset is available here.
And can be used with:
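```python
from datasets import load_dataset

# Load the LoveRance lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/loverance")
```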
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on LoveRance's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
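```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/loverance')
generator("I am", num_return_sequences=5)
```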
Or with the Transformers library:
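```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/loverance")
model = AutoModelWithLMHead.from_pretrained("huggingartists/loverance")
```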
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LOVV66</div>
<a href="https://genius.com/artists/lovv66">
<div style="text-align: center; font-size: 14px;">@lovv66</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from LOVV66.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lovv66).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lovv66")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1t6a2fxs/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on LOVV66's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1de08pf6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1de08pf6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lovv66')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lovv66")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lovv66")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lovv66"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lovv66
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lovv66",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lovv66 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">LOVV66</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lovv66</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from LOVV66.
Dataset is available here.
And can be used with:
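```python
from datasets import load_dataset

# Load the LOVV66 lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/lovv66")
```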
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on LOVV66's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
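```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lovv66')
generator("I am", num_return_sequences=5)
```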
Or with the Transformers library:
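```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lovv66")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lovv66")
```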
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lumen</div>
<a href="https://genius.com/artists/lumen">
<div style="text-align: center; font-size: 14px;">@lumen</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Lumen.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lumen).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lumen")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2fkqbnvl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Lumen's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1vhfm4ch) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1vhfm4ch/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lumen')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lumen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lumen")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lumen"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lumen
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lumen",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lumen #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Lumen</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lumen</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Lumen.
Dataset is available here.
And can be used with:
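```python
from datasets import load_dataset

# Load the Lumen lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/lumen")
```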
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Lumen's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
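```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lumen')
generator("I am", num_return_sequences=5)
```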
Or with the Transformers library:
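```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lumen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lumen")
```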
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ляпис Трубецкой (Lyapis Trubetskoy)</div>
<a href="https://genius.com/artists/lyapis-trubetskoy">
<div style="text-align: center; font-size: 14px;">@lyapis-trubetskoy</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ляпис Трубецкой (Lyapis Trubetskoy).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/lyapis-trubetskoy).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/lyapis-trubetskoy")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1ycs0usm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ляпис Трубецкой (Lyapis Trubetskoy)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/uz1xtq0k) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/uz1xtq0k/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/lyapis-trubetskoy')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/lyapis-trubetskoy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lyapis-trubetskoy")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/lyapis-trubetskoy"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/lyapis-trubetskoy
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/lyapis-trubetskoy",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/lyapis-trubetskoy #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ляпис Трубецкой (Lyapis Trubetskoy)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@lyapis-trubetskoy</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ляпис Трубецкой (Lyapis Trubetskoy).
Dataset is available here.
And can be used with:
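```python
from datasets import load_dataset

# Load the Lyapis Trubetskoy lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/lyapis-trubetskoy")
```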
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ляпис Трубецкой (Lyapis Trubetskoy)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
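```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/lyapis-trubetskoy')
generator("I am", num_return_sequences=5)
```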
Or with the Transformers library:
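```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/lyapis-trubetskoy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/lyapis-trubetskoy")
```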
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MACAN</div>
<a href="https://genius.com/artists/macan">
<div style="text-align: center; font-size: 14px;">@macan</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MACAN.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/macan).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/macan")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3u3vx3xp/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MACAN's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/23krf2tu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/23krf2tu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/macan')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/macan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/macan")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/macan"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/macan
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/macan",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/macan #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MACAN</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@macan</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MACAN.
Dataset is available here.
And can be used with:
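```python
from datasets import load_dataset

# Load the MACAN lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/macan")
```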
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MACAN's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
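```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/macan')
generator("I am", num_return_sequences=5)
```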
Or with the Transformers library:
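```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/macan")
model = AutoModelWithLMHead.from_pretrained("huggingartists/macan")
```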
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Machine Gun Kelly</div>
<a href="https://genius.com/artists/machine-gun-kelly">
<div style="text-align: center; font-size: 14px;">@machine-gun-kelly</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Machine Gun Kelly.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/machine-gun-kelly).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/machine-gun-kelly")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/33f2ce6m/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Machine Gun Kelly's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2bbn6fvb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2bbn6fvb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/machine-gun-kelly')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/machine-gun-kelly")
model = AutoModelWithLMHead.from_pretrained("huggingartists/machine-gun-kelly")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/machine-gun-kelly"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/machine-gun-kelly
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/machine-gun-kelly",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/machine-gun-kelly #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Machine Gun Kelly</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@machine-gun-kelly</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Machine Gun Kelly.
Dataset is available here.
And can be used with:
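```python
from datasets import load_dataset

# Load the Machine Gun Kelly lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/machine-gun-kelly")
```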
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Machine Gun Kelly's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
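```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/machine-gun-kelly')
generator("I am", num_return_sequences=5)
```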
Or with the Transformers library:
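```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/machine-gun-kelly")
model = AutoModelWithLMHead.from_pretrained("huggingartists/machine-gun-kelly")
```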
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Madonna</div>
<a href="https://genius.com/artists/madonna">
<div style="text-align: center; font-size: 14px;">@madonna</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Madonna.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/madonna).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/madonna")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2abhif57/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Madonna's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2eok9fmu) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2eok9fmu/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/madonna')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/madonna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/madonna")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/madonna"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/madonna
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/madonna",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/madonna #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Madonna</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@madonna</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Madonna.
Dataset is available here.
And can be used with:
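```python
from datasets import load_dataset

# Load the Madonna lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/madonna")
```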
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Madonna's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
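```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/madonna')
generator("I am", num_return_sequences=5)
```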
Or with the Transformers library:
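```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/madonna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/madonna")
```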
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Marillion</div>
<a href="https://genius.com/artists/marillion">
<div style="text-align: center; font-size: 14px;">@marillion</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Marillion.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/marillion).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/marillion")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/bajnt52i/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Marillion's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/wi2lgudb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/wi2lgudb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/marillion')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/marillion")
model = AutoModelWithLMHead.from_pretrained("huggingartists/marillion")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/marillion"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/marillion
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/marillion",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/marillion #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Marillion</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@marillion</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Marillion.
Dataset is available here.
And can be used with:
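```python
from datasets import load_dataset

# Load the Marillion lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/marillion")
```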
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Marillion's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
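```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/marillion')
generator("I am", num_return_sequences=5)
```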
Or with the Transformers library:
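```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/marillion")
model = AutoModelWithLMHead.from_pretrained("huggingartists/marillion")
```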
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Maroon 5</div>
<a href="https://genius.com/artists/maroon-5">
<div style="text-align: center; font-size: 14px;">@maroon-5</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Maroon 5.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/maroon-5).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/maroon-5")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/38629b22/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Maroon 5's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2ylk8pym) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2ylk8pym/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/maroon-5')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/maroon-5")
model = AutoModelWithLMHead.from_pretrained("huggingartists/maroon-5")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/maroon-5"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/maroon-5
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/maroon-5",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/maroon-5 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Maroon 5</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@maroon-5</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Maroon 5.
Dataset is available here.
And can be used with:
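```python
from datasets import load_dataset

# Load the Maroon 5 lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/maroon-5")
```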
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Maroon 5's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
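```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/maroon-5')
generator("I am", num_return_sequences=5)
```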
Or with the Transformers library:
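```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/maroon-5")
model = AutoModelWithLMHead.from_pretrained("huggingartists/maroon-5")
```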
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Машина Времени (Mashina Vremeni)</div>
<a href="https://genius.com/artists/mashina-vremeni">
<div style="text-align: center; font-size: 14px;">@mashina-vremeni</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Машина Времени (Mashina Vremeni).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mashina-vremeni).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mashina-vremeni")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3r1yxrx7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Машина Времени (Mashina Vremeni)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1cgaltpc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1cgaltpc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mashina-vremeni')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mashina-vremeni")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mashina-vremeni")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mashina-vremeni"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mashina-vremeni
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mashina-vremeni",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mashina-vremeni #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Машина Времени (Mashina Vremeni)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mashina-vremeni</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Машина Времени (Mashina Vremeni).
Dataset is available here.
And can be used with:
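```python
from datasets import load_dataset

# Load the Mashina Vremeni lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/mashina-vremeni")
```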
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Машина Времени (Mashina Vremeni)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
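```python
from transformers import pipeline

# Generate five continuations of the prompt "I am"
generator = pipeline('text-generation',
                     model='huggingartists/mashina-vremeni')
generator("I am", num_return_sequences=5)
```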
Or with the Transformers library:
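```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/mashina-vremeni")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mashina-vremeni")
```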
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mating Ritual</div>
<a href="https://genius.com/artists/mating-ritual">
<div style="text-align: center; font-size: 14px;">@mating-ritual</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Mating Ritual.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mating-ritual).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mating-ritual")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3cljintu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Mating Ritual's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/dv1g3x3b) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/dv1g3x3b/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mating-ritual')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mating-ritual")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mating-ritual")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mating-ritual"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mating-ritual
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mating-ritual",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mating-ritual #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mating Ritual</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mating-ritual</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Mating Ritual.
Dataset is available here.
And can be used with:
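```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mating-ritual")
```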
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Mating Ritual's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
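```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mating-ritual')
generator("I am", num_return_sequences=5)
```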
Or with Transformers library:
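```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mating-ritual")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mating-ritual")
```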
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Макс Корж (Max Korzh)</div>
<a href="https://genius.com/artists/max-korzh">
<div style="text-align: center; font-size: 14px;">@max-korzh</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Макс Корж (Max Korzh).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/max-korzh).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/max-korzh")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2lupo5gy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Макс Корж (Max Korzh)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1pm64gaa) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1pm64gaa/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/max-korzh')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/max-korzh")
model = AutoModelWithLMHead.from_pretrained("huggingartists/max-korzh")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/max-korzh"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/max-korzh
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/max-korzh",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/max-korzh #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Макс Корж (Max Korzh)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@max-korzh</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Макс Корж (Max Korzh).
Dataset is available here.
And can be used with:
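```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/max-korzh")
```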
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Макс Корж (Max Korzh)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
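```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/max-korzh')
generator("I am", num_return_sequences=5)
```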
Or with Transformers library:
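```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/max-korzh")
model = AutoModelWithLMHead.from_pretrained("huggingartists/max-korzh")
```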
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Макс Корж (Max Korzh)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Макс Корж (Max Korzh)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Макс Корж (Max Korzh)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MAYOT</div>
<a href="https://genius.com/artists/mayot">
<div style="text-align: center; font-size: 14px;">@mayot</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MAYOT.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mayot).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mayot")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/lf4wcx85/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MAYOT's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1uulibm2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1uulibm2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mayot')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mayot")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mayot")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mayot"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mayot
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mayot",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mayot #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MAYOT</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mayot</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MAYOT.
Dataset is available here.
And can be used with:
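```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mayot")
```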
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MAYOT's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
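```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mayot')
generator("I am", num_return_sequences=5)
```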
Or with Transformers library:
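```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mayot")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mayot")
```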
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MC Ride</div>
<a href="https://genius.com/artists/mc-ride">
<div style="text-align: center; font-size: 14px;">@mc-ride</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MC Ride.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mc-ride).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mc-ride")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2ar7kgj5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MC Ride's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/299iw75q) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/299iw75q/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mc-ride')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mc-ride")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mc-ride")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mc-ride"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mc-ride
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mc-ride",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mc-ride #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MC Ride</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mc-ride</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MC Ride.
Dataset is available here.
And can be used with:
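```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mc-ride")
```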
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MC Ride's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
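```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mc-ride')
generator("I am", num_return_sequences=5)
```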
Or with Transformers library:
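```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mc-ride")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mc-ride")
```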
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Melanie Martinez</div>
<a href="https://genius.com/artists/melanie-martinez">
<div style="text-align: center; font-size: 14px;">@melanie-martinez</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Melanie Martinez.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/melanie-martinez).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/melanie-martinez")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/lb3ks0y5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Melanie Martinez's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2rvs9wvc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2rvs9wvc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/melanie-martinez')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/melanie-martinez")
model = AutoModelWithLMHead.from_pretrained("huggingartists/melanie-martinez")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/melanie-martinez"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/melanie-martinez
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/melanie-martinez",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/melanie-martinez #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Melanie Martinez</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@melanie-martinez</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Melanie Martinez.
Dataset is available here.
And can be used with:
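```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/melanie-martinez")
```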
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Melanie Martinez's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
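```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/melanie-martinez')
generator("I am", num_return_sequences=5)
```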
Or with Transformers library:
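```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/melanie-martinez")
model = AutoModelWithLMHead.from_pretrained("huggingartists/melanie-martinez")
```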
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Metallica</div>
<a href="https://genius.com/artists/metallica">
<div style="text-align: center; font-size: 14px;">@metallica</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Metallica.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/metallica).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/metallica")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/30glu695/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Metallica's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2m1o5q6p) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2m1o5q6p/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/metallica')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/metallica")
model = AutoModelWithLMHead.from_pretrained("huggingartists/metallica")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/metallica"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/metallica
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/metallica",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/metallica #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Metallica</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@metallica</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Metallica.
Dataset is available here.
And can be used with:
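```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/metallica")
```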
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Metallica's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
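```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/metallica')
generator("I am", num_return_sequences=5)
```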
Or with Transformers library:
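```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/metallica")
model = AutoModelWithLMHead.from_pretrained("huggingartists/metallica")
```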
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MF DOOM</div>
<a href="https://genius.com/artists/mf-doom">
<div style="text-align: center; font-size: 14px;">@mf-doom</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MF DOOM.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mf-doom).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mf-doom")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3lhrsfds/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MF DOOM's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/vw48qbeh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/vw48qbeh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mf-doom')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mf-doom")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mf-doom")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mf-doom"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mf-doom
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mf-doom",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mf-doom #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MF DOOM</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mf-doom</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MF DOOM.
Dataset is available here.
And can be used with:
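```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mf-doom")
```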
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MF DOOM's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
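```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mf-doom')
generator("I am", num_return_sequences=5)
```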
Or with Transformers library:
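```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mf-doom")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mf-doom")
```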
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Михаил Горшенев (Mikhail Gorshenev)</div>
<a href="https://genius.com/artists/mikhail-gorshenev">
<div style="text-align: center; font-size: 14px;">@mikhail-gorshenev</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Михаил Горшенев (Mikhail Gorshenev).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mikhail-gorshenev).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mikhail-gorshenev")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3h9endcz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Михаил Горшенев (Mikhail Gorshenev)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1kdp29bz) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1kdp29bz/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mikhail-gorshenev')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mikhail-gorshenev")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mikhail-gorshenev")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mikhail-gorshenev"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mikhail-gorshenev
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mikhail-gorshenev",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mikhail-gorshenev #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Михаил Горшенев (Mikhail Gorshenev)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mikhail-gorshenev</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Михаил Горшенев (Mikhail Gorshenev).
Dataset is available here.
And can be used with:
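```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mikhail-gorshenev")
```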
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Михаил Горшенев (Mikhail Gorshenev)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
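```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mikhail-gorshenev')
generator("I am", num_return_sequences=5)
```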
Or with Transformers library:
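```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mikhail-gorshenev")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mikhail-gorshenev")
```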
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Михаил Горшенев (Mikhail Gorshenev)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Михаил Горшенев (Mikhail Gorshenev)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Михаил Горшенев (Mikhail Gorshenev)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Miyagi</div>
<a href="https://genius.com/artists/miyagi">
<div style="text-align: center; font-size: 14px;">@miyagi</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Miyagi.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/miyagi).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/miyagi")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1c4sny4a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Miyagi's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1v51pw0u) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1v51pw0u/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/miyagi')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/miyagi")
model = AutoModelWithLMHead.from_pretrained("huggingartists/miyagi")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/miyagi"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/miyagi
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/miyagi",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/miyagi #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Miyagi</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@miyagi</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Miyagi.
Dataset is available here.
And can be used with:
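```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/miyagi")
```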
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Miyagi's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
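```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/miyagi')
generator("I am", num_return_sequences=5)
```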
Or with Transformers library:
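```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/miyagi")
model = AutoModelWithLMHead.from_pretrained("huggingartists/miyagi")
```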
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mnogoznaal</div>
<a href="https://genius.com/artists/mnogoznaal">
<div style="text-align: center; font-size: 14px;">@mnogoznaal</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Mnogoznaal.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mnogoznaal).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mnogoznaal")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/21uo4oav/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Mnogoznaal's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/13v4iqfe) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/13v4iqfe/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mnogoznaal')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mnogoznaal")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mnogoznaal")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mnogoznaal"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mnogoznaal
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mnogoznaal",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mnogoznaal #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Mnogoznaal</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mnogoznaal</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Mnogoznaal.
Dataset is available here.
And can be used with:
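```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mnogoznaal")
```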
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Mnogoznaal's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
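```python
from transformers import pipeline
generator = pipeline('text-generation',
                     model='huggingartists/mnogoznaal')
generator("I am", num_return_sequences=5)
```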
Or with Transformers library:
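```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mnogoznaal")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mnogoznaal")
```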
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MORGENSHTERN</div>
<a href="https://genius.com/artists/morgenshtern">
<div style="text-align: center; font-size: 14px;">@morgenshtern</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from MORGENSHTERN.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/morgenshtern).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/morgenshtern")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/lmrnk6sz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on MORGENSHTERN's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1m2jynlh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1m2jynlh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/morgenshtern')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/morgenshtern")
model = AutoModelWithLMHead.from_pretrained("huggingartists/morgenshtern")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/morgenshtern"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/morgenshtern
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/morgenshtern",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/morgenshtern #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">MORGENSHTERN</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@morgenshtern</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from MORGENSHTERN.
Dataset is available here.
And can be used with:
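```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/morgenshtern")
```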
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on MORGENSHTERN's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
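```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/morgenshtern')
generator("I am", num_return_sequences=5)
```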
Or with Transformers library:
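```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/morgenshtern")
model = AutoModelWithLMHead.from_pretrained("huggingartists/morgenshtern")
```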
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Мумий Тролль (Mumiy Troll)</div>
<a href="https://genius.com/artists/mumiy-troll">
<div style="text-align: center; font-size: 14px;">@mumiy-troll</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Мумий Тролль (Mumiy Troll).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/mumiy-troll).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/mumiy-troll")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/8o66pyeu/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Мумий Тролль (Mumiy Troll)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/32hmbbel) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/32hmbbel/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/mumiy-troll')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/mumiy-troll")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mumiy-troll")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/mumiy-troll"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/mumiy-troll
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/mumiy-troll",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/mumiy-troll #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Мумий Тролль (Mumiy Troll)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@mumiy-troll</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Мумий Тролль (Mumiy Troll).
Dataset is available here.
And can be used with:
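```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/mumiy-troll")
```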
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Мумий Тролль (Mumiy Troll)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
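```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/mumiy-troll')
generator("I am", num_return_sequences=5)
```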
Or with Transformers library:
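```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/mumiy-troll")
model = AutoModelWithLMHead.from_pretrained("huggingartists/mumiy-troll")
```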
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Muse</div>
<a href="https://genius.com/artists/muse">
<div style="text-align: center; font-size: 14px;">@muse</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Muse.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/muse).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/muse")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3w58rwod/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Muse's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3j03atcr) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3j03atcr/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/muse')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/muse")
model = AutoModelWithLMHead.from_pretrained("huggingartists/muse")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/muse"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/muse
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/muse",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/muse #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Muse</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@muse</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Muse.
Dataset is available here.
And can be used with:
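```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/muse")
```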
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Muse's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
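```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/muse')
generator("I am", num_return_sequences=5)
```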
Or with Transformers library:
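```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/muse")
model = AutoModelWithLMHead.from_pretrained("huggingartists/muse")
```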
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Нервы (Nervy)</div>
<a href="https://genius.com/artists/nervy">
<div style="text-align: center; font-size: 14px;">@nervy</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Нервы (Nervy).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/nervy).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/nervy")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/34zj7k43/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Нервы (Nervy)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2pd7k5jf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2pd7k5jf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/nervy')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/nervy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/nervy")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/nervy"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/nervy
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/nervy",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/nervy #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Нервы (Nervy)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@nervy</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Нервы (Nervy).
Dataset is available here.
And can be used with:
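```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/nervy")
```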
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Нервы (Nervy)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
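```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/nervy')
generator("I am", num_return_sequences=5)
```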
Or with Transformers library:
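```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/nervy")
model = AutoModelWithLMHead.from_pretrained("huggingartists/nervy")
```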
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nirvana</div>
<a href="https://genius.com/artists/nirvana">
<div style="text-align: center; font-size: 14px;">@nirvana</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Nirvana.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/nirvana).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/nirvana")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1bj9eav1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Nirvana's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3vzztlsq) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3vzztlsq/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/nirvana')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/nirvana")
model = AutoModelWithLMHead.from_pretrained("huggingartists/nirvana")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/nirvana"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/nirvana
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/nirvana",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/nirvana #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Nirvana</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@nirvana</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Nirvana.
Dataset is available here.
And can be used with:
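```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/nirvana")
```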
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Nirvana's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
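```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/nirvana')
generator("I am", num_return_sequences=5)
```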
Or with Transformers library:
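```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/nirvana")
model = AutoModelWithLMHead.from_pretrained("huggingartists/nirvana")
```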
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">OBLADAET</div>
<a href="https://genius.com/artists/obladaet">
<div style="text-align: center; font-size: 14px;">@obladaet</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from OBLADAET.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/obladaet).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/obladaet")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1mtsuuwr/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on OBLADAET's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1s9epb35) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1s9epb35/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/obladaet')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/obladaet")
model = AutoModelWithLMHead.from_pretrained("huggingartists/obladaet")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/obladaet"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/obladaet
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/obladaet",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/obladaet #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">OBLADAET</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@obladaet</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from OBLADAET.
Dataset is available here.
And can be used with:
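```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/obladaet")
```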
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on OBLADAET's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
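```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/obladaet')
generator("I am", num_return_sequences=5)
```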
Or with Transformers library:
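```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/obladaet")
model = AutoModelWithLMHead.from_pretrained("huggingartists/obladaet")
```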
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">OG Buda</div>
<a href="https://genius.com/artists/og-buda">
<div style="text-align: center; font-size: 14px;">@og-buda</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from OG Buda.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/og-buda).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/og-buda")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2ic775kv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on OG Buda's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1g4193mx) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1g4193mx/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/og-buda')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/og-buda")
model = AutoModelWithLMHead.from_pretrained("huggingartists/og-buda")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/og-buda"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/og-buda
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/og-buda",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/og-buda #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">OG Buda</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@og-buda</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from OG Buda.
Dataset is available here.
And can be used with:
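```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/og-buda")
```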
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on OG Buda's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
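```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/og-buda')
generator("I am", num_return_sequences=5)
```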
Or with Transformers library:
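```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/og-buda")
model = AutoModelWithLMHead.from_pretrained("huggingartists/og-buda")
```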
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">O.T (RUS)</div>
<a href="https://genius.com/artists/ot-rus">
<div style="text-align: center; font-size: 14px;">@ot-rus</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from O.T (RUS).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ot-rus).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ot-rus")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/35byet4r/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on O.T (RUS)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2p2tawej) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2p2tawej/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ot-rus')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ot-rus")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ot-rus")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ot-rus"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/ot-rus
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ot-rus",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ot-rus #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">O.T (RUS)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@ot-rus</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from O.T (RUS).
Dataset is available here.
And can be used with:
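```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/ot-rus")
```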
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on O.T (RUS)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
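```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/ot-rus')
generator("I am", num_return_sequences=5)
```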
Or with Transformers library:
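```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/ot-rus")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ot-rus")
```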
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Our Last Night</div>
<a href="https://genius.com/artists/our-last-night">
<div style="text-align: center; font-size: 14px;">@our-last-night</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Our Last Night.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/our-last-night).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/our-last-night")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/37o66f2j/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Our Last Night's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1hifralf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1hifralf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/our-last-night')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/our-last-night")
model = AutoModelWithLMHead.from_pretrained("huggingartists/our-last-night")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/our-last-night"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/our-last-night
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/our-last-night",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/our-last-night #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Our Last Night</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@our-last-night</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Our Last Night.
Dataset is available here.
And can be used with:
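```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/our-last-night")
```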
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Our Last Night's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
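```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/our-last-night')
generator("I am", num_return_sequences=5)
```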
Or with Transformers library:
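```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/our-last-night")
model = AutoModelWithLMHead.from_pretrained("huggingartists/our-last-night")
```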
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Oxxxymiron</div>
<a href="https://genius.com/artists/oxxxymiron">
<div style="text-align: center; font-size: 14px;">@oxxxymiron</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Oxxxymiron.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/oxxxymiron).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/oxxxymiron")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/e254c9iz/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Oxxxymiron's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1ggk9c4z) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1ggk9c4z/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/oxxxymiron')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/oxxxymiron")
model = AutoModelWithLMHead.from_pretrained("huggingartists/oxxxymiron")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/oxxxymiron"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/oxxxymiron
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/oxxxymiron",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/oxxxymiron #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Oxxxymiron</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@oxxxymiron</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Oxxxymiron.
Dataset is available here.
And can be used with:
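```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/oxxxymiron")
```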
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Oxxxymiron's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
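```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/oxxxymiron')
generator("I am", num_return_sequences=5)
```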
Or with Transformers library:
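```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/oxxxymiron")
model = AutoModelWithLMHead.from_pretrained("huggingartists/oxxxymiron")
```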
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
 <div class="flex">
 <div
 style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
 </div>
 </div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Peter, Paul and Mary</div>
<a href="https://genius.com/artists/peter-paul-and-mary">
<div style="text-align: center; font-size: 14px;">@peter-paul-and-mary</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Peter, Paul and Mary.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/peter-paul-and-mary).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/peter-paul-and-mary")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/svwa6bev/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Peter, Paul and Mary's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1s4mkr9x) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1s4mkr9x/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/peter-paul-and-mary')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/peter-paul-and-mary")
model = AutoModelWithLMHead.from_pretrained("huggingartists/peter-paul-and-mary")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/peter-paul-and-mary"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/peter-paul-and-mary
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/peter-paul-and-mary",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/peter-paul-and-mary #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Peter, Paul and Mary</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@peter-paul-and-mary</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Peter, Paul and Mary.
Dataset is available here.
And can be used with:
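The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/peter-paul-and-mary")
```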
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Peter, Paul and Mary's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
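Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/peter-paul-and-mary')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/peter-paul-and-mary")
model = AutoModelWithLMHead.from_pretrained("huggingartists/peter-paul-and-mary")
```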
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">PHARAOH</div>
<a href="https://genius.com/artists/pharaoh">
<div style="text-align: center; font-size: 14px;">@pharaoh</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from PHARAOH.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/pharaoh).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/pharaoh")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/jefxst5w/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on PHARAOH's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1fqlqxjo) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1fqlqxjo/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/pharaoh')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pharaoh")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pharaoh")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/pharaoh"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/pharaoh
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/pharaoh",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/pharaoh #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">PHARAOH</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@pharaoh</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from PHARAOH.
Dataset is available here.
And can be used with:
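The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/pharaoh")
```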
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on PHARAOH's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
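Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/pharaoh')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pharaoh")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pharaoh")
```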
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Phish</div>
<a href="https://genius.com/artists/phish">
<div style="text-align: center; font-size: 14px;">@phish</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Phish.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/phish).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/phish")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/22sghxz4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Phish's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/340yi6e5) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/340yi6e5/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/phish')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/phish")
model = AutoModelWithLMHead.from_pretrained("huggingartists/phish")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/phish"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/phish
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/phish",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/phish #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Phish</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@phish</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Phish.
Dataset is available here.
And can be used with:
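The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/phish")
```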
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Phish's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
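Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/phish')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/phish")
model = AutoModelWithLMHead.from_pretrained("huggingartists/phish")
```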
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Pink Floyd</div>
<a href="https://genius.com/artists/pink-floyd">
<div style="text-align: center; font-size: 14px;">@pink-floyd</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Pink Floyd.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/pink-floyd).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/pink-floyd")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3j9osgks/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Pink Floyd's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1wlqpngf) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1wlqpngf/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/pink-floyd')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pink-floyd")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pink-floyd")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/pink-floyd"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/pink-floyd
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/pink-floyd",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/pink-floyd #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Pink Floyd</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@pink-floyd</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Pink Floyd.
Dataset is available here.
And can be used with:
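The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/pink-floyd")
```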
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Pink Floyd's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
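Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/pink-floyd')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pink-floyd")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pink-floyd")
```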
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Placebo</div>
<a href="https://genius.com/artists/placebo">
<div style="text-align: center; font-size: 14px;">@placebo</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Placebo.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/placebo).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/placebo")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3jfcdfc1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Placebo's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/jx3r5x9o) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/jx3r5x9o/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/placebo')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/placebo")
model = AutoModelWithLMHead.from_pretrained("huggingartists/placebo")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/placebo"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/placebo
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/placebo",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/placebo #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Placebo</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@placebo</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Placebo.
Dataset is available here.
And can be used with:
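The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/placebo")
```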
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Placebo's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
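Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/placebo')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/placebo")
model = AutoModelWithLMHead.from_pretrained("huggingartists/placebo")
```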
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Платина (Platina)</div>
<a href="https://genius.com/artists/platina">
<div style="text-align: center; font-size: 14px;">@platina</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Платина (Platina).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/platina).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/platina")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2ih365j7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Платина (Platina)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1quasiz0) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1quasiz0/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/platina')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/platina")
model = AutoModelWithLMHead.from_pretrained("huggingartists/platina")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/platina"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/platina
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/platina",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/platina #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Платина (Platina)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@platina</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Платина (Platina).
Dataset is available here.
And can be used with:
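The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/platina")
```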
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Платина (Platina)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
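Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/platina')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/platina")
model = AutoModelWithLMHead.from_pretrained("huggingartists/platina")
```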
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Post Malone</div>
<a href="https://genius.com/artists/post-malone">
<div style="text-align: center; font-size: 14px;">@post-malone</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Post Malone.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/post-malone).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/post-malone")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/5ig21wpy/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Post Malone's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2ih9ntzv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2ih9ntzv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/post-malone')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/post-malone")
model = AutoModelWithLMHead.from_pretrained("huggingartists/post-malone")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/post-malone"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/post-malone
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/post-malone",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/post-malone #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Post Malone</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@post-malone</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Post Malone.
Dataset is available here.
And can be used with:
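The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/post-malone")
```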
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Post Malone's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
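Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/post-malone')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/post-malone")
model = AutoModelWithLMHead.from_pretrained("huggingartists/post-malone")
```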
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">pyrokinesis</div>
<a href="https://genius.com/artists/pyrokinesis">
<div style="text-align: center; font-size: 14px;">@pyrokinesis</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from pyrokinesis.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/pyrokinesis).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/pyrokinesis")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1s8696f3/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on pyrokinesis's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/22hm2utc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/22hm2utc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/pyrokinesis')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pyrokinesis")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pyrokinesis")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/pyrokinesis"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/pyrokinesis
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/pyrokinesis",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/pyrokinesis #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">pyrokinesis</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@pyrokinesis</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from pyrokinesis.
Dataset is available here.
And can be used with:
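The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/pyrokinesis")
```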
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on pyrokinesis's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
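Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/pyrokinesis')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/pyrokinesis")
model = AutoModelWithLMHead.from_pretrained("huggingartists/pyrokinesis")
```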
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Queen</div>
<a href="https://genius.com/artists/queen">
<div style="text-align: center; font-size: 14px;">@queen</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Queen.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/queen).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/queen")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1jdprwq2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Queen's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2lvkoamo) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2lvkoamo/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/queen')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/queen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/queen")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/queen"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/queen
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/queen",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/queen #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Queen</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@queen</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Queen.
Dataset is available here.
And can be used with:
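The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/queen")
```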
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Queen's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
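Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/queen')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/queen")
model = AutoModelWithLMHead.from_pretrained("huggingartists/queen")
```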
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Radiohead</div>
<a href="https://genius.com/artists/radiohead">
<div style="text-align: center; font-size: 14px;">@radiohead</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Radiohead.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/radiohead).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/radiohead")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/35vxvq9n/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Radiohead's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2bulf32i) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2bulf32i/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/radiohead')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/radiohead")
model = AutoModelWithLMHead.from_pretrained("huggingartists/radiohead")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/radiohead"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/radiohead
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/radiohead",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/radiohead #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Radiohead</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@radiohead</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Radiohead.
Dataset is available here.
And can be used with:
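The stripped snippet, restored from the original card above:

```python
from datasets import load_dataset

# Load the lyrics dataset used to fine-tune this model
dataset = load_dataset("huggingartists/radiohead")
```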
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Radiohead's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with the Transformers library:
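Both stripped usage snippets, restored as one sketch from the original card above (`AutoModelWithLMHead` follows the source; newer Transformers releases deprecate it in favor of `AutoModelForCausalLM`):

```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Directly with a text-generation pipeline:
generator = pipeline('text-generation',
                     model='huggingartists/radiohead')
generator("I am", num_return_sequences=5)

# Or by loading the tokenizer and model explicitly:
tokenizer = AutoTokenizer.from_pretrained("huggingartists/radiohead")
model = AutoModelWithLMHead.from_pretrained("huggingartists/radiohead")
```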
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ramil’</div>
<a href="https://genius.com/artists/ramil">
<div style="text-align: center; font-size: 14px;">@ramil</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Ramil’.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/ramil).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/ramil")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1l1axl7k/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Ramil’'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/28boyxm8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/28boyxm8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/ramil')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ramil")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ramil")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/ramil"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/ramil
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/ramil",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/ramil #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Ramil’</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@ramil</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Ramil’.
The dataset is available here and can be loaded with:
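```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/ramil")
```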
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Ramil’s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
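```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/ramil')
generator("I am", num_return_sequences=5)
```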
Or with the Transformers library:
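```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/ramil")
model = AutoModelWithLMHead.from_pretrained("huggingartists/ramil")
```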
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rammstein</div>
<a href="https://genius.com/artists/rammstein">
<div style="text-align: center; font-size: 14px;">@rammstein</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Rammstein.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/rammstein) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/rammstein")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/qt3qa1x1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Rammstein's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2yyigjzv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2yyigjzv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/rammstein')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rammstein")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rammstein")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/rammstein"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/rammstein
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/rammstein",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/rammstein #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rammstein</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@rammstein</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Rammstein.
The dataset is available here and can be loaded with:
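```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/rammstein")
```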
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Rammstein's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
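```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/rammstein')
generator("I am", num_return_sequences=5)
```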
Or with the Transformers library:
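```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rammstein")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rammstein")
```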
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Red Hot Chili Peppers</div>
<a href="https://genius.com/artists/red-hot-chili-peppers">
<div style="text-align: center; font-size: 14px;">@red-hot-chili-peppers</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Red Hot Chili Peppers.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/red-hot-chili-peppers) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/red-hot-chili-peppers")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2spp06qm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Red Hot Chili Peppers' lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/opiwx19q) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/opiwx19q/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/red-hot-chili-peppers')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/red-hot-chili-peppers")
model = AutoModelWithLMHead.from_pretrained("huggingartists/red-hot-chili-peppers")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/red-hot-chili-peppers"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/red-hot-chili-peppers
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/red-hot-chili-peppers",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/red-hot-chili-peppers #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Red Hot Chili Peppers</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@red-hot-chili-peppers</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Red Hot Chili Peppers.
The dataset is available here and can be loaded with:
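```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/red-hot-chili-peppers")
```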
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Red Hot Chili Peppers' lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
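```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/red-hot-chili-peppers')
generator("I am", num_return_sequences=5)
```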
Or with the Transformers library:
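```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/red-hot-chili-peppers")
model = AutoModelWithLMHead.from_pretrained("huggingartists/red-hot-chili-peppers")
```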
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rex Orange County</div>
<a href="https://genius.com/artists/rex-orange-county">
<div style="text-align: center; font-size: 14px;">@rex-orange-county</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Rex Orange County.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/rex-orange-county) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/rex-orange-county")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3by3xc64/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Rex Orange County's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1bwctmad) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1bwctmad/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/rex-orange-county')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rex-orange-county")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rex-orange-county")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/rex-orange-county"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/rex-orange-county
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/rex-orange-county",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/rex-orange-county #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rex Orange County</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@rex-orange-county</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Rex Orange County.
The dataset is available here and can be loaded with:
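```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/rex-orange-county")
```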
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Rex Orange County's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
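```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/rex-orange-county')
generator("I am", num_return_sequences=5)
```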
Or with the Transformers library:
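```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rex-orange-county")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rex-orange-county")
```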
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rihanna</div>
<a href="https://genius.com/artists/rihanna">
<div style="text-align: center; font-size: 14px;">@rihanna</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Rihanna.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/rihanna) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/rihanna")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/ee6eogks/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Rihanna's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1mvns7x8) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1mvns7x8/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/rihanna')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rihanna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rihanna")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/rihanna"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/rihanna
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/rihanna",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/rihanna #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Rihanna</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@rihanna</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Rihanna.
The dataset is available here and can be loaded with:
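```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/rihanna")
```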
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Rihanna's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
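```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/rihanna')
generator("I am", num_return_sequences=5)
```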
Or with the Transformers library:
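```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rihanna")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rihanna")
```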
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ROCKET</div>
<a href="https://genius.com/artists/rocket">
<div style="text-align: center; font-size: 14px;">@rocket</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from ROCKET.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/rocket) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/rocket")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3ceqmb05/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on ROCKET's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/37kckftd) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/37kckftd/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/rocket')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rocket")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rocket")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/rocket"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/rocket
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/rocket",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/rocket #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">ROCKET</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@rocket</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from ROCKET.
The dataset is available here and can be loaded with:
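```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/rocket")
```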
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on ROCKET's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
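```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/rocket')
generator("I am", num_return_sequences=5)
```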
Or with the Transformers library:
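```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/rocket")
model = AutoModelWithLMHead.from_pretrained("huggingartists/rocket")
```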
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sam Kim (샘김)</div>
<a href="https://genius.com/artists/sam-kim">
<div style="text-align: center; font-size: 14px;">@sam-kim</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Sam Kim (샘김).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/sam-kim) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sam-kim")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/38e0f1wf/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Sam Kim (샘김)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2rke2zbk) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2rke2zbk/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/sam-kim')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sam-kim")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sam-kim")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/sam-kim"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/sam-kim
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/sam-kim",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/sam-kim #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sam Kim (샘김)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@sam-kim</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Sam Kim (샘김).
The dataset is available here and can be loaded with:
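```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/sam-kim")
```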
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Sam Kim (샘김)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
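```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/sam-kim')
generator("I am", num_return_sequences=5)
```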
Or with the Transformers library:
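```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sam-kim")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sam-kim")
```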
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Скриптонит (Scriptonite)</div>
<a href="https://genius.com/artists/scriptonite">
<div style="text-align: center; font-size: 14px;">@scriptonite</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Скриптонит (Scriptonite).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/scriptonite) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/scriptonite")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/13pxeww0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Скриптонит (Scriptonite)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1itfp830) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1itfp830/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/scriptonite')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/scriptonite")
model = AutoModelWithLMHead.from_pretrained("huggingartists/scriptonite")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/scriptonite"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/scriptonite
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/scriptonite",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/scriptonite #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Скриптонит (Scriptonite)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@scriptonite</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Скриптонит (Scriptonite).
The dataset is available here and can be loaded with:
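```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/scriptonite")
```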
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Скриптонит (Scriptonite)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
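```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/scriptonite')
generator("I am", num_return_sequences=5)
```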
Or with the Transformers library:
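```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/scriptonite")
model = AutoModelWithLMHead.from_pretrained("huggingartists/scriptonite")
```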
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Сергей Летов (Sergei Letov)</div>
<a href="https://genius.com/artists/sergei-letov">
<div style="text-align: center; font-size: 14px;">@sergei-letov</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Сергей Летов (Sergei Letov).
The dataset is available [here](https://huggingface.co/datasets/huggingartists/sergei-letov) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sergei-letov")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1chw67j7/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Сергей Летов (Sergei Letov)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/my7m2jp6) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/my7m2jp6/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/sergei-letov')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sergei-letov")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sergei-letov")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/sergei-letov"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/sergei-letov
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/sergei-letov",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/sergei-letov #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Сергей Летов (Sergei Letov)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@sergei-letov</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Сергей Летов (Sergei Letov).
The dataset is available here and can be loaded with:
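```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/sergei-letov")
```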
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Сергей Летов (Sergei Letov)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
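```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/sergei-letov')
generator("I am", num_return_sequences=5)
```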
Or with the Transformers library:
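```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sergei-letov")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sergei-letov")
```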
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">shadowraze</div>
<a href="https://genius.com/artists/shadowraze">
<div style="text-align: center; font-size: 14px;">@shadowraze</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from shadowraze.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/shadowraze) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/shadowraze")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/pkbkflsq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on shadowraze's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/tiu2mjo1) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/tiu2mjo1/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/shadowraze')
generator("I am", num_return_sequences=5)
```
Or with the Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/shadowraze")
model = AutoModelWithLMHead.from_pretrained("huggingartists/shadowraze")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/shadowraze"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/shadowraze
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/shadowraze",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/shadowraze #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">shadowraze</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@shadowraze</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from shadowraze.
The dataset is available here and can be loaded with:
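```python
from datasets import load_dataset

# load the artist's lyrics dataset from the Hugging Face Hub
dataset = load_dataset("huggingartists/shadowraze")
```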
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on shadowraze's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
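```python
from transformers import pipeline

# text-generation pipeline backed by the fine-tuned GPT-2
generator = pipeline('text-generation',
                     model='huggingartists/shadowraze')
generator("I am", num_return_sequences=5)
```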
Or with the Transformers library:
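```python
from transformers import AutoTokenizer, AutoModelWithLMHead

# load the tokenizer and model weights directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/shadowraze")
model = AutoModelWithLMHead.from_pretrained("huggingartists/shadowraze")
```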
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Skillet</div>
<a href="https://genius.com/artists/skillet">
<div style="text-align: center; font-size: 14px;">@skillet</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Skillet.
The dataset is available [here](https://huggingface.co/datasets/huggingartists/skillet) and can be loaded with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/skillet")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1wmbkzn8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Skillet's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3jke6b6i) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3jke6b6i/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/skillet')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/skillet")
model = AutoModelWithLMHead.from_pretrained("huggingartists/skillet")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/skillet"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/skillet
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/skillet",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/skillet #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Skillet</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@skillet</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Skillet.
Dataset is available here.
And can be used with:
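A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/skillet")
```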
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Skillet's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
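Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/skillet')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/skillet")
model = AutoModelWithLMHead.from_pretrained("huggingartists/skillet")
```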
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Слава КПСС (Slava KPSS)</div>
<a href="https://genius.com/artists/slava-kpss">
<div style="text-align: center; font-size: 14px;">@slava-kpss</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Слава КПСС (Slava KPSS).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/slava-kpss).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/slava-kpss")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2f2r3u3b/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Слава КПСС (Slava KPSS)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/pecxkpae) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/pecxkpae/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/slava-kpss')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/slava-kpss")
model = AutoModelWithLMHead.from_pretrained("huggingartists/slava-kpss")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/slava-kpss"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/slava-kpss
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/slava-kpss",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/slava-kpss #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Слава КПСС (Slava KPSS)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@slava-kpss</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Слава КПСС (Slava KPSS).
Dataset is available here.
And can be used with:
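A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/slava-kpss")
```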
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Слава КПСС (Slava KPSS)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
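Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/slava-kpss')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/slava-kpss")
model = AutoModelWithLMHead.from_pretrained("huggingartists/slava-kpss")
```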
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Слава КПСС (Slava KPSS)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Слава КПСС (Slava KPSS)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Слава КПСС (Slava KPSS)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">SLAVA MARLOW</div>
<a href="https://genius.com/artists/slava-marlow">
<div style="text-align: center; font-size: 14px;">@slava-marlow</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from SLAVA MARLOW.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/slava-marlow).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/slava-marlow")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1fdcz1s5/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on SLAVA MARLOW's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/ro4q353s) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/ro4q353s/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/slava-marlow')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/slava-marlow")
model = AutoModelWithLMHead.from_pretrained("huggingartists/slava-marlow")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/slava-marlow"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/slava-marlow
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/slava-marlow",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/slava-marlow #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">SLAVA MARLOW</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@slava-marlow</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from SLAVA MARLOW.
Dataset is available here.
And can be used with:
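A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/slava-marlow")
```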
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on SLAVA MARLOW's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
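Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/slava-marlow')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/slava-marlow")
model = AutoModelWithLMHead.from_pretrained("huggingartists/slava-marlow")
```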
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Snoop Dogg</div>
<a href="https://genius.com/artists/snoop-dogg">
<div style="text-align: center; font-size: 14px;">@snoop-dogg</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Snoop Dogg.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/snoop-dogg).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/snoop-dogg")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/xru6xdjl/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Snoop Dogg's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1o72aoie) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1o72aoie/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/snoop-dogg')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/snoop-dogg")
model = AutoModelWithLMHead.from_pretrained("huggingartists/snoop-dogg")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/snoop-dogg"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/snoop-dogg
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/snoop-dogg",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/snoop-dogg #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Snoop Dogg</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@snoop-dogg</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Snoop Dogg.
Dataset is available here.
And can be used with:
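A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/snoop-dogg")
```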
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Snoop Dogg's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
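Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/snoop-dogg')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/snoop-dogg")
model = AutoModelWithLMHead.from_pretrained("huggingartists/snoop-dogg")
```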
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sqwore</div>
<a href="https://genius.com/artists/sqwore">
<div style="text-align: center; font-size: 14px;">@sqwore</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Sqwore.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/sqwore).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sqwore")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3gzd5crq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Sqwore's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/vzeft23g) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/vzeft23g/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/sqwore')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sqwore")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sqwore")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/sqwore"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/sqwore
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/sqwore",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/sqwore #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sqwore</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@sqwore</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Sqwore.
Dataset is available here.
And can be used with:
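A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/sqwore")
```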
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Sqwore's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
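Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/sqwore')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sqwore")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sqwore")
```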
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sugar Ray</div>
<a href="https://genius.com/artists/sugar-ray">
<div style="text-align: center; font-size: 14px;">@sugar-ray</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Sugar Ray.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/sugar-ray).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sugar-ray")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/10440qj4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Sugar Ray's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2n3xk5nv) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2n3xk5nv/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                    model='huggingartists/sugar-ray')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sugar-ray")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sugar-ray")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/sugar-ray"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/sugar-ray
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/sugar-ray",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/sugar-ray #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sugar Ray</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@sugar-ray</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Sugar Ray.
Dataset is available here.
And can be used with:
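A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/sugar-ray")
```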
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Sugar Ray's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
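Both usage snippets from the full card above, the pipeline call followed by the Transformers-library alternative:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/sugar-ray')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sugar-ray")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sugar-ray")
```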
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Suicideoscope</div>
<a href="https://genius.com/artists/suicideoscope">
<div style="text-align: center; font-size: 14px;">@suicideoscope</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Suicideoscope.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/suicideoscope).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/suicideoscope")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/17opu10a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Suicideoscope's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2w46luqb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2w46luqb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                    model='huggingartists/suicideoscope')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/suicideoscope")
model = AutoModelWithLMHead.from_pretrained("huggingartists/suicideoscope")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/suicideoscope"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/suicideoscope
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/suicideoscope",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/suicideoscope #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Suicideoscope</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@suicideoscope</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Suicideoscope.
Dataset is available here.
And can be used with:
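A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/suicideoscope")
```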
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Suicideoscope's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
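Both usage snippets from the full card above, the pipeline call followed by the Transformers-library alternative:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/suicideoscope')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/suicideoscope")
model = AutoModelWithLMHead.from_pretrained("huggingartists/suicideoscope")
```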
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sum 41</div>
<a href="https://genius.com/artists/sum-41">
<div style="text-align: center; font-size: 14px;">@sum-41</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Sum 41.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/sum-41).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/sum-41")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3fy2kvn1/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Sum 41's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2hgx7kne) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2hgx7kne/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/sum-41')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sum-41")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sum-41")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/sum-41"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/sum-41
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/sum-41",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/sum-41 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Sum 41</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@sum-41</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Sum 41.
Dataset is available here.
And can be used with:
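A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/sum-41")
```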
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Sum 41's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
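Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/sum-41')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/sum-41")
model = AutoModelWithLMHead.from_pretrained("huggingartists/sum-41")
```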
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">System of a Down</div>
<a href="https://genius.com/artists/system-of-a-down">
<div style="text-align: center; font-size: 14px;">@system-of-a-down</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from System of a Down.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/system-of-a-down).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/system-of-a-down")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3m1sikv8/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on System of a Down's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/wf3qe4yi) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/wf3qe4yi/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/system-of-a-down')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/system-of-a-down")
model = AutoModelWithLMHead.from_pretrained("huggingartists/system-of-a-down")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/system-of-a-down"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/system-of-a-down
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/system-of-a-down",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/system-of-a-down #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">System of a Down</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@system-of-a-down</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from System of a Down.
Dataset is available here.
And can be used with:
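A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/system-of-a-down")
```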
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on System of a Down's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
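Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/system-of-a-down')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/system-of-a-down")
model = AutoModelWithLMHead.from_pretrained("huggingartists/system-of-a-down")
```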
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Танцы Минус (Tanzy Minus)</div>
<a href="https://genius.com/artists/tanzy-minus">
<div style="text-align: center; font-size: 14px;">@tanzy-minus</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Танцы Минус (Tanzy Minus).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/tanzy-minus).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tanzy-minus")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/14vmwaxq/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Танцы Минус (Tanzy Minus)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/ru5wxieh) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/ru5wxieh/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/tanzy-minus')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tanzy-minus")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tanzy-minus")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/tanzy-minus"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/tanzy-minus
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/tanzy-minus",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/tanzy-minus #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Танцы Минус (Tanzy Minus)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@tanzy-minus</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Танцы Минус (Tanzy Minus).
Dataset is available here.
And can be used with:
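A minimal loading sketch, matching the snippet in the full card above:
```python
from datasets import load_dataset

# Load the lyrics dataset used for fine-tuning
dataset = load_dataset("huggingartists/tanzy-minus")
```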
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Танцы Минус (Tanzy Minus)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
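Both usage snippets, as in the full card above:
```python
from transformers import pipeline, AutoTokenizer, AutoModelWithLMHead

# Text-generation pipeline
generator = pipeline('text-generation',
                     model='huggingartists/tanzy-minus')
generator("I am", num_return_sequences=5)

# Or with Transformers library: load the tokenizer and model directly
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tanzy-minus")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tanzy-minus")
```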
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Танцы Минус (Tanzy Minus)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Танцы Минус (Tanzy Minus)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Танцы Минус (Tanzy Minus)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Taylor Swift</div>
<a href="https://genius.com/artists/taylor-swift">
<div style="text-align: center; font-size: 14px;">@taylor-swift</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Taylor Swift.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/taylor-swift).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/taylor-swift")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2l84tzp2/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Taylor Swift's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1hy7aa65) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1hy7aa65/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/taylor-swift')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/taylor-swift")
model = AutoModelWithLMHead.from_pretrained("huggingartists/taylor-swift")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/taylor-swift"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/taylor-swift
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/taylor-swift",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/taylor-swift #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Taylor Swift</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@taylor-swift</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Taylor Swift.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Taylor Swift's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The 69 Eyes</div>
<a href="https://genius.com/artists/the-69-eyes">
<div style="text-align: center; font-size: 14px;">@the-69-eyes</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The 69 Eyes.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-69-eyes).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-69-eyes")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/26sibipb/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The 69 Eyes's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1mjcdm16) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1mjcdm16/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-69-eyes')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-69-eyes")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-69-eyes")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-69-eyes"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-69-eyes
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-69-eyes",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-69-eyes #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The 69 Eyes</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-69-eyes</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The 69 Eyes.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The 69 Eyes's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Beatles</div>
<a href="https://genius.com/artists/the-beatles">
<div style="text-align: center; font-size: 14px;">@the-beatles</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Beatles.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-beatles).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-beatles")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2p2c5864/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Beatles's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/286vzjah) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/286vzjah/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-beatles')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-beatles")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-beatles")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-beatles"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-beatles
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-beatles",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-beatles #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Beatles</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-beatles</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Beatles.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Beatles's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Gazette</div>
<a href="https://genius.com/artists/the-gazette">
<div style="text-align: center; font-size: 14px;">@the-gazette</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Gazette.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-gazette).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-gazette")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/3ck1sdfv/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Gazette's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/m1wevlws) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/m1wevlws/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-gazette')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-gazette")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-gazette")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-gazette"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-gazette
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-gazette",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-gazette #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Gazette</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-gazette</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Gazette.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Gazette's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Grateful Dead</div>
<a href="https://genius.com/artists/the-grateful-dead">
<div style="text-align: center; font-size: 14px;">@the-grateful-dead</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Grateful Dead.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-grateful-dead).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-grateful-dead")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2agvlyoo/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Grateful Dead's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1ex4c8kc) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1ex4c8kc/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-grateful-dead')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-grateful-dead")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-grateful-dead")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-grateful-dead"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-grateful-dead
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-grateful-dead",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-grateful-dead #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Grateful Dead</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-grateful-dead</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Grateful Dead.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Grateful Dead's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Король и Шут (The King and the Jester)</div>
<a href="https://genius.com/artists/the-king-and-the-jester">
<div style="text-align: center; font-size: 14px;">@the-king-and-the-jester</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Король и Шут (The King and the Jester).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-king-and-the-jester).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-king-and-the-jester")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1qw2ic95/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Король и Шут (The King and the Jester)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/hhhj9047) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/hhhj9047/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-king-and-the-jester')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-king-and-the-jester")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-king-and-the-jester")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-king-and-the-jester"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-king-and-the-jester
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-king-and-the-jester",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-king-and-the-jester #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Король и Шут (The King and the Jester)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-king-and-the-jester</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Король и Шут (The King and the Jester).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Король и Шут (The King and the Jester)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Король и Шут (The King and the Jester)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Король и Шут (The King and the Jester)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Король и Шут (The King and the Jester)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Notorious B.I.G.</div>
<a href="https://genius.com/artists/the-notorious-big">
<div style="text-align: center; font-size: 14px;">@the-notorious-big</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Notorious B.I.G..
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-notorious-big).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-notorious-big")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/wkvasju4/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Notorious B.I.G.'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1coezuy2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1coezuy2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-notorious-big')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-notorious-big")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-notorious-big")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-notorious-big"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-notorious-big
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-notorious-big",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-notorious-big #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Notorious B.I.G.</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-notorious-big</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Notorious B.I.G..
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Notorious B.I.G.'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Sugarcubes</div>
<a href="https://genius.com/artists/the-sugarcubes">
<div style="text-align: center; font-size: 14px;">@the-sugarcubes</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Sugarcubes.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-sugarcubes).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-sugarcubes")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1zrlgv5f/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Sugarcubes's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/24shllae) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/24shllae/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-sugarcubes')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-sugarcubes")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-sugarcubes")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-sugarcubes"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-sugarcubes
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-sugarcubes",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-sugarcubes #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Sugarcubes</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-sugarcubes</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Sugarcubes.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Sugarcubes's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The ‘’Вепри’’ (The Pigs)</div>
<a href="https://genius.com/artists/the-the-pigs">
<div style="text-align: center; font-size: 14px;">@the-the-pigs</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The ‘’Вепри’’ (The Pigs).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-the-pigs).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-the-pigs")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/7yh65db9/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The ‘’Вепри’’ (The Pigs)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/65gj1lk1) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/65gj1lk1/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-the-pigs')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-the-pigs")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-the-pigs")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-the-pigs"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-the-pigs
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-the-pigs",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-the-pigs #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The ‘’Вепри’’ (The Pigs)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-the-pigs</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The ‘’Вепри’’ (The Pigs).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The ‘’Вепри’’ (The Pigs)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on The ‘’Вепри’’ (The Pigs)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on The ‘’Вепри’’ (The Pigs)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n.\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on The ‘’Вепри’’ (The Pigs)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Velvet Underground</div>
<a href="https://genius.com/artists/the-velvet-underground">
<div style="text-align: center; font-size: 14px;">@the-velvet-underground</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Velvet Underground.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-velvet-underground).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-velvet-underground")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/lbkqy84q/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Velvet Underground's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1e4s74q4) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1e4s74q4/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-velvet-underground')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-velvet-underground")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-velvet-underground")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-velvet-underground"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-velvet-underground
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-velvet-underground",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-velvet-underground #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Velvet Underground</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-velvet-underground</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Velvet Underground.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Velvet Underground's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the user's tweets further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Weeknd</div>
<a href="https://genius.com/artists/the-weeknd">
<div style="text-align: center; font-size: 14px;">@the-weeknd</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from The Weeknd.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/the-weeknd).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/the-weeknd")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/34tqtrsm/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on The Weeknd's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1pjby702) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1pjby702/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/the-weeknd')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-weeknd")
model = AutoModelWithLMHead.from_pretrained("huggingartists/the-weeknd")
```
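Note that `AutoModelWithLMHead` is deprecated in recent versions of Transformers; a minimal equivalent sketch using `AutoModelForCausalLM` (assuming the same checkpoint) is:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("huggingartists/the-weeknd")
model = AutoModelForCausalLM.from_pretrained("huggingartists/the-weeknd")

# Encode a prompt and sample a short continuation.
inputs = tokenizer("I am", return_tensors="pt")
outputs = model.generate(**inputs, max_length=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```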
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/the-weeknd"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/the-weeknd
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/the-weeknd",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/the-weeknd #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">The Weeknd</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@the-weeknd</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from The Weeknd.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on The Weeknd's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tiamat</div>
<a href="https://genius.com/artists/tiamat">
<div style="text-align: center; font-size: 14px;">@tiamat</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Tiamat.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/tiamat).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tiamat")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1tqzwb4a/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Tiamat's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/ttkys3mq) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/ttkys3mq/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/tiamat')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tiamat")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tiamat")
```
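The pipeline forwards generation keyword arguments to `model.generate()`, so sampling can be tuned directly; the values below are illustrative rather than tuned for this model:
```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingartists/tiamat')
# Higher temperature gives more varied lyrics; top_p keeps sampling
# within the most probable tokens (nucleus sampling).
generator("I am", max_length=60, do_sample=True, temperature=0.9, top_p=0.95)
```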
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/tiamat"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/tiamat
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/tiamat",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/tiamat #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tiamat</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@tiamat</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Tiamat.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Tiamat's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Till Lindemann</div>
<a href="https://genius.com/artists/till-lindemann">
<div style="text-align: center; font-size: 14px;">@till-lindemann</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Till Lindemann.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/till-lindemann).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/till-lindemann")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2xh6fyqt/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Till Lindemann's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/32ohf092) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/32ohf092/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/till-lindemann')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/till-lindemann")
model = AutoModelWithLMHead.from_pretrained("huggingartists/till-lindemann")
```
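Since much of Till Lindemann's catalogue is in German, a German prompt may match the fine-tuning data better than an English one (an assumption, not verified here):
```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingartists/till-lindemann')
# "Ich bin" is the German counterpart of the "I am" prompt used above.
generator("Ich bin", num_return_sequences=5)
```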
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/till-lindemann"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/till-lindemann
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/till-lindemann",
"autotrain_compatible",
"endpoints_compatible",
"has_space",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/till-lindemann #autotrain_compatible #endpoints_compatible #has_space #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Till Lindemann</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@till-lindemann</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Till Lindemann.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Till Lindemann's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tom Waits</div>
<a href="https://genius.com/artists/tom-waits">
<div style="text-align: center; font-size: 14px;">@tom-waits</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Tom Waits.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/tom-waits).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tom-waits")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/216zw2jw/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Tom Waits's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/16iei9vt) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/16iei9vt/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
                    model='huggingartists/tom-waits')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tom-waits")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tom-waits")
```
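For reproducible samples, Transformers provides `set_seed`, which seeds the relevant random number generators in one call; a short sketch:
```python
from transformers import pipeline, set_seed

set_seed(42)  # fixes the sampling RNGs so repeated runs return the same lyrics
generator = pipeline('text-generation', model='huggingartists/tom-waits')
generator("I am", num_return_sequences=5)
```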
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/tom-waits"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/tom-waits
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/tom-waits",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/tom-waits #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tom Waits</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@tom-waits</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Tom Waits.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Tom Waits's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)</div>
<a href="https://genius.com/artists/tony-raut-and-garry-topor">
<div style="text-align: center; font-size: 14px;">@tony-raut-and-garry-topor</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Тони Раут (Tony Raut) & Гарри Топор (Garry Topor).
Dataset is available [here](https://huggingface.co/datasets/huggingartists/tony-raut-and-garry-topor).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tony-raut-and-garry-topor")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/xnzxet17/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)'s lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/tfby1rj2) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/tfby1rj2/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/tony-raut-and-garry-topor')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tony-raut-and-garry-topor")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tony-raut-and-garry-topor")
```
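Because the training lyrics here are largely Russian, prompting in Russian is likely to stay closer to the fine-tuning distribution (an assumption, not tested here):
```python
from transformers import pipeline

generator = pipeline('text-generation',
                     model='huggingartists/tony-raut-and-garry-topor')
# "Я иду" ("I am walking") is an illustrative Russian prompt.
generator("Я иду", num_return_sequences=5)
```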
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/tony-raut-and-garry-topor"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/tony-raut-and-garry-topor
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/tony-raut-and-garry-topor",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/tony-raut-and-garry-topor #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@tony-raut-and-garry-topor</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Тони Раут (Tony Raut) & Гарри Топор (Garry Topor).
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)'s lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
 & Гарри Топор (Garry Topor).\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n & Гарри Топор (Garry Topor).\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.",
"## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.",
"## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:",
"## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.",
"## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n & Гарри Топор (Garry Topor).\n\nDataset is available here.\nAnd can be used with:\n\n\n\nExplore the data, which is tracked with W&B artifacts at every step of the pipeline.## Training procedure\n\nThe model is based on a pre-trained GPT-2 which is fine-tuned on Тони Раут (Tony Raut) & Гарри Топор (Garry Topor)'s lyrics.\n\nHyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.\n\nAt the end of training, the final model is logged and versioned.## How to use\n\nYou can use this model directly with a pipeline for text generation:\n\n\n\nOr with Transformers library:## Limitations and bias\n\nThe model suffers from the same limitations and bias as GPT-2.\n\nIn addition, the data present in the user's tweets further affects the text generated by the model.## About\n\n*Built by Aleksey Korshuk*\n\n\n\nFor more details, visit the project repository.\n\n">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tool</div>
<a href="https://genius.com/artists/tool">
<div style="text-align: center; font-size: 14px;">@tool</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Tool.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/tool).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/tool")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2w1h70ok/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Tool's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1zikehwi) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1zikehwi/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/tool')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/tool")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tool")
```
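To avoid re-downloading the checkpoint on every run, the model and tokenizer can be saved to a local directory; the path below is arbitrary:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/tool")
model = AutoModelWithLMHead.from_pretrained("huggingartists/tool")

# Write both to disk; later calls can load from this directory instead.
model.save_pretrained("./tool-lyrics-model")
tokenizer.save_pretrained("./tool-lyrics-model")
```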
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/tool"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/tool
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/tool",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/tool #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Tool</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@tool</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Tool.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Tool's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Travis Scott</div>
<a href="https://genius.com/artists/travis-scott">
<div style="text-align: center; font-size: 14px;">@travis-scott</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from Travis Scott.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/travis-scott).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/travis-scott")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/1ezlbvd0/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on Travis Scott's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2w91gglb) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2w91gglb/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/travis-scott')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/travis-scott")
model = AutoModelWithLMHead.from_pretrained("huggingartists/travis-scott")
```
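Generation is considerably faster on a GPU when one is available; the pipeline accepts a `device` index (a sketch, assuming PyTorch is installed):
```python
import torch
from transformers import pipeline

# device=0 selects the first CUDA GPU; -1 falls back to CPU.
device = 0 if torch.cuda.is_available() else -1
generator = pipeline('text-generation',
                     model='huggingartists/travis-scott',
                     device=device)
generator("I am", num_return_sequences=5)
```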
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/travis-scott"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/travis-scott
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/travis-scott",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/travis-scott #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">Travis Scott</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@travis-scott</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from Travis Scott.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on Travis Scott's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">twenty one pilots</div>
<a href="https://genius.com/artists/twenty-one-pilots">
<div style="text-align: center; font-size: 14px;">@twenty-one-pilots</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from twenty one pilots.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/twenty-one-pilots).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/twenty-one-pilots")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2wr3j4nk/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on twenty one pilots' lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/3jhgvd5t) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/3jhgvd5t/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/twenty-one-pilots')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/twenty-one-pilots")
model = AutoModelWithLMHead.from_pretrained("huggingartists/twenty-one-pilots")
```
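The pipeline also accepts a list of prompts and returns one result list per prompt; a minimal sketch:
```python
from transformers import pipeline

generator = pipeline('text-generation', model='huggingartists/twenty-one-pilots')
# Each prompt gets num_return_sequences completions of its own.
prompts = ["I am", "We were"]
results = generator(prompts, num_return_sequences=2, max_length=40)
```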
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/twenty-one-pilots"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/twenty-one-pilots
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/twenty-one-pilots",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/twenty-one-pilots #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">twenty one pilots</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@twenty-one-pilots</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from twenty one pilots.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on twenty one pilots' lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">UPSAHL</div>
<a href="https://genius.com/artists/upsahl">
<div style="text-align: center; font-size: 14px;">@upsahl</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from UPSAHL.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/upsahl).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/upsahl")
```
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/2o3af3ts/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on UPSAHL's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/2lr9eqkt) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/2lr9eqkt/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/upsahl')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/upsahl")
model = AutoModelWithLMHead.from_pretrained("huggingartists/upsahl")
```
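GPT-2 tokenizers ship without a padding token, which matters as soon as inputs are batched; a common workaround (assuming this checkpoint uses the standard GPT-2 tokenizer) is to reuse the end-of-sequence token:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead

tokenizer = AutoTokenizer.from_pretrained("huggingartists/upsahl")
model = AutoModelWithLMHead.from_pretrained("huggingartists/upsahl")

# Reuse EOS as the pad token so padded batches tokenize without errors.
tokenizer.pad_token = tokenizer.eos_token
model.config.pad_token_id = tokenizer.eos_token_id
```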
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/upsahl"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/upsahl
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/upsahl",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/upsahl #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">UPSAHL</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@upsahl</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from UPSAHL.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on UPSAHL's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL')">
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800">🤖 HuggingArtists Model 🤖</div>
<div style="text-align: center; font-size: 16px; font-weight: 800">V $ X V PRiNCE</div>
<a href="https://genius.com/artists/v-x-v-prince">
<div style="text-align: center; font-size: 14px;">@v-x-v-prince</div>
</a>
</div>
I was made with [huggingartists](https://github.com/AlekseyKorshuk/huggingartists).
Create your own bot based on your favorite artist with [the demo](https://colab.research.google.com/github/AlekseyKorshuk/huggingartists/blob/master/huggingartists-demo.ipynb)!
## How does it work?
To understand how the model was developed, check the [W&B report](https://wandb.ai/huggingartists/huggingartists/reportlist).
## Training data
The model was trained on lyrics from V $ X V PRiNCE.
Dataset is available [here](https://huggingface.co/datasets/huggingartists/v-x-v-prince).
And can be used with:
```python
from datasets import load_dataset
dataset = load_dataset("huggingartists/v-x-v-prince")
```
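To sanity-check what was loaded, print the dataset object; huggingartists datasets typically expose a single `train` split with a `text` column of lyrics (an assumption worth confirming):
```python
from datasets import load_dataset

dataset = load_dataset("huggingartists/v-x-v-prince")
print(dataset)              # shows splits, columns, and row counts
print(dataset["train"][0])  # assumes a "train" split exists
```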
[Explore the data](https://wandb.ai/huggingartists/huggingartists/runs/a6qdzbfe/artifacts), which is tracked with [W&B artifacts](https://docs.wandb.com/artifacts) at every step of the pipeline.
## Training procedure
The model is based on a pre-trained [GPT-2](https://huggingface.co/gpt2) which is fine-tuned on V $ X V PRiNCE's lyrics.
Hyperparameters and metrics are recorded in the [W&B training run](https://wandb.ai/huggingartists/huggingartists/runs/1rv03n56) for full transparency and reproducibility.
At the end of training, [the final model](https://wandb.ai/huggingartists/huggingartists/runs/1rv03n56/artifacts) is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
```python
from transformers import pipeline
generator = pipeline('text-generation',
model='huggingartists/v-x-v-prince')
generator("I am", num_return_sequences=5)
```
Or with Transformers library:
```python
from transformers import AutoTokenizer, AutoModelWithLMHead
tokenizer = AutoTokenizer.from_pretrained("huggingartists/v-x-v-prince")
model = AutoModelWithLMHead.from_pretrained("huggingartists/v-x-v-prince")
```
## Limitations and bias
The model suffers from [the same limitations and bias as GPT-2](https://huggingface.co/gpt2#limitations-and-bias).
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*
[](https://github.com/AlekseyKorshuk)
[](https://twitter.com/intent/follow?screen_name=alekseykorshuk)
[](https://t.me/joinchat/_CQ04KjcJ-4yZTky)
For more details, visit the project repository.
[](https://github.com/AlekseyKorshuk/huggingartists)
|
{"language": "en", "tags": ["huggingartists", "lyrics", "lm-head", "causal-lm"], "datasets": ["huggingartists/v-x-v-prince"], "widget": [{"text": "I am"}]}
|
text-generation
|
huggingartists/v-x-v-prince
|
[
"transformers",
"pytorch",
"jax",
"gpt2",
"text-generation",
"huggingartists",
"lyrics",
"lm-head",
"causal-lm",
"en",
"dataset:huggingartists/v-x-v-prince",
"autotrain_compatible",
"endpoints_compatible",
"text-generation-inference",
"region:us"
] |
2022-03-02T23:29:05+00:00
|
[] |
[
"en"
] |
TAGS
#transformers #pytorch #jax #gpt2 #text-generation #huggingartists #lyrics #lm-head #causal-lm #en #dataset-huggingartists/v-x-v-prince #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
|
<div class="inline-flex flex-col" style="line-height: 1.5;">
<div class="flex">
<div
style="display:DISPLAY_1; margin-left: auto; margin-right: auto; width: 92px; height:92px; border-radius: 50%; background-size: cover; background-image: url('URL
</div>
</div>
<div style="text-align: center; margin-top: 3px; font-size: 16px; font-weight: 800"> HuggingArtists Model </div>
<div style="text-align: center; font-size: 16px; font-weight: 800">V $ X V PRiNCE</div>
<a href="URL
<div style="text-align: center; font-size: 14px;">@v-x-v-prince</div>
</a>
</div>
I was made with huggingartists.
Create your own bot based on your favorite artist with the demo!
## How does it work?
To understand how the model was developed, check the W&B report.
## Training data
The model was trained on lyrics from V $ X V PRiNCE.
Dataset is available here.
And can be used with:
Explore the data, which is tracked with W&B artifacts at every step of the pipeline.
## Training procedure
The model is based on a pre-trained GPT-2 which is fine-tuned on V $ X V PRiNCE's lyrics.
Hyperparameters and metrics are recorded in the W&B training run for full transparency and reproducibility.
At the end of training, the final model is logged and versioned.
## How to use
You can use this model directly with a pipeline for text generation:
Or with Transformers library:
## Limitations and bias
The model suffers from the same limitations and bias as GPT-2.
In addition, the data present in the artist's lyrics further affects the text generated by the model.
## About
*Built by Aleksey Korshuk*

For more details, visit the project repository.