This is a GPT-2 (124M) model trained with llm.c for 100B tokens on FineWeb-EDU, using a WSD (Warmup-Stable-Decay) learning rate schedule.

More detailed information and observations are here: https://x.com/Yuchenj_UW/status/1816181774374109250
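As a reference for the schedule shape, here is a minimal sketch in Python, assuming linear warmup to a peak, a constant plateau for the bulk of training, and a linear decay at the end; the peak LR and phase fractions below are illustrative assumptions, not the values used for this run:

```python
def wsd_lr(step: int, total_steps: int, peak_lr: float = 6e-4,
           min_lr: float = 0.0, warmup_frac: float = 0.01,
           decay_frac: float = 0.2) -> float:
    """Warmup-Stable-Decay: linear warmup, constant plateau, linear decay.
    All hyperparameters here are illustrative assumptions."""
    warmup_steps = max(1, int(total_steps * warmup_frac))
    decay_steps = max(1, int(total_steps * decay_frac))
    decay_start = total_steps - decay_steps
    if step < warmup_steps:
        # Warmup: ramp linearly from ~0 up to peak_lr.
        return peak_lr * (step + 1) / warmup_steps
    if step < decay_start:
        # Stable: hold the peak learning rate for most of training.
        return peak_lr
    # Decay: anneal linearly from peak_lr down to min_lr.
    progress = (step - decay_start) / decay_steps
    return peak_lr + progress * (min_lr - peak_lr)

# Example: sample the schedule at a few points of a run.
total = 100_000
for s in (0, 500, 50_000, 85_000, 99_999):
    print(s, f"{wsd_lr(s, total):.2e}")
```

Unlike a cosine schedule, the stable phase means the run can be extended or branched into a decay at any point without retraining from scratch.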

The weights are provided in safetensors format as BF16 tensors (124M parameters).