LLaMa-3-8B-First-4-Layers / mergekit_config.yml
# Extract the first four transformer layers of Llama-3-8B as a standalone model.
merge_method: passthrough  # copy weights through unchanged; no parameter blending
dtype: bfloat16            # store the merged weights in bfloat16
slices:
  - sources:
      - layer_range: [0, 4]  # layers 0-3; the end index is exclusive
        model: NousResearch/Meta-Llama-3-8B
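
A config like this is typically run with mergekit's mergekit-yaml command-line tool; the sketch below assumes mergekit is installed via pip, and the output directory name is illustrative, not part of this repository.

# Install mergekit, then build the 4-layer model from this config
pip install mergekit
mergekit-yaml mergekit_config.yml ./llama-3-8b-first-4-layers

The resulting directory contains a small truncated model (embeddings plus layers 0-3), which is useful as a base for layer-level experiments rather than as a usable language model on its own.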