
CodeParrot uses the GPT-2 architecture with a BPE tokenizer trained on Python code. We released this model as an educational tool for training large language models from scratch on code, with detailed tutorials and descriptions of the training process. It uses Accelerate for distributed training and mixed precision. See this blog and repo for more details.
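The core idea behind a BPE tokenizer is simple: start from individual characters and repeatedly merge the most frequent adjacent pair into a new token. The minimal sketch below illustrates this merge loop in pure Python (it is an illustration of the algorithm, not CodeParrot's actual tokenizer, which is trained with the Hugging Face tokenizers library on a large Python corpus):

```python
from collections import Counter


def most_frequent_pair(tokens):
    """Return the most common adjacent token pair."""
    pairs = Counter(zip(tokens, tokens[1:]))
    return max(pairs, key=pairs.get)


def merge_pair(tokens, pair):
    """Replace every occurrence of `pair` with a single merged token."""
    out, i = [], 0
    while i < len(tokens):
        if i + 1 < len(tokens) and (tokens[i], tokens[i + 1]) == pair:
            out.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            out.append(tokens[i])
            i += 1
    return out


# Start from the characters of a tiny Python snippet and apply a few merges.
text = "def add(a, b): return a + b"
tokens = list(text)
for _ in range(5):
    tokens = merge_pair(tokens, most_frequent_pair(tokens))

print(tokens)
```

Each merge shortens the sequence while preserving the underlying text, so frequent character sequences in code (keywords, common identifiers) end up as single tokens.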

| Model | # parameters |
|-------|--------------|
| GPT-2 | 1.5B |