gpt2_rope_prevlayer_qk / config.json
{
  "vocab_size": 32768,
  "n_embed": 384,
  "n_head": 6,
  "n_layer": 6,
  "block_size": 256,
  "dropout": 0.2
}
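A minimal sketch of how a training script might consume this config, assuming a hypothetical `GPTConfig` dataclass whose field names simply mirror the JSON keys (this is not an official library class). One sanity check worth doing at load time: `n_embed` must be divisible by `n_head`, since each attention head gets `n_embed / n_head` dimensions.

```python
import json
from dataclasses import dataclass

# Hypothetical container mirroring the JSON keys above;
# not part of any official library API.
@dataclass
class GPTConfig:
    vocab_size: int
    n_embed: int
    n_head: int
    n_layer: int
    block_size: int
    dropout: float

raw = ('{"vocab_size": 32768, "n_embed": 384, "n_head": 6, '
       '"n_layer": 6, "block_size": 256, "dropout": 0.2}')
cfg = GPTConfig(**json.loads(raw))

# Each head attends over n_embed / n_head dimensions: 384 / 6 = 64.
assert cfg.n_embed % cfg.n_head == 0, "n_embed must divide evenly across heads"
head_dim = cfg.n_embed // cfg.n_head
print(head_dim)  # 64
```

In practice the JSON would be read from the repo file (`json.load(open("config.json"))`) rather than an inline string; the inline copy just keeps the sketch self-contained.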