---
datasets:
  - allenai/c4
language:
  - en
library_name: transformers
license: apache-2.0
base_model:
  - cpayne1303/cp2024
---

## Model Description

This is a 30-million-parameter model using the Llama 2 architecture. It was trained on approximately 2 billion tokens of diverse web data drawn from the first 1,000,000 rows of the uncleaned C4 English dataset.
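Since the card lists `transformers` as the library, the model can presumably be loaded with the standard auto classes. A minimal sketch, assuming the repo id `cpayne1303/cp2024` from the card metadata (the prompt text and generation settings here are illustrative, not from the card):

```python
# Sketch: load the model and run a short generation with transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cpayne1303/cp2024"  # repo id from the card metadata

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Illustrative prompt; a 30M-parameter model will produce simple continuations.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Generation quality will be limited at this scale; the example only demonstrates the loading path.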