Quantized GPT-2 model (pavfi-at-m/gpt2GPTQ).

Generative Pre-trained Transformer 2 (GPT-2) is a large language model by OpenAI and the second in their foundational series of GPT models. GPT-2 was pre-trained on WebText, a dataset of roughly 8 million web pages collected from outbound links on Reddit. (BookCorpus, a collection of over 7,000 unpublished fiction books, was the training corpus for the original GPT, not GPT-2.)
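
The repository name suggests a GPTQ-quantized checkpoint. As a minimal usage sketch, assuming the repository follows the standard GPTQ layout that recent `transformers` releases load transparently (with the `optimum` and `auto-gptq` packages installed alongside `transformers`), it can be loaded like any causal language model:

```python
# Minimal usage sketch, assuming a standard GPTQ checkpoint layout that
# `transformers` recognizes when `optimum` and `auto-gptq` are installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pavfi-at-m/gpt2GPTQ"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" (requires `accelerate`) places the quantized weights on
# the available device; GPTQ kernels generally expect a CUDA GPU.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Hello, I'm a language model,"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If loading fails, the checkpoint may predate the integrated loading path and require the `auto_gptq` API directly; the exact requirements depend on how the quantized files were produced.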

