MS_a-coolyte-2409-22B / mergekit_config.yml
models:
  - model: unsloth/Mistral-Small-Instruct-2409+rAIfle/Acolyte-LORA
    parameters:
      weight: 1
      density: 1 # Adjusted density to target a training loss of 0.3
base_model: unsloth/Mistral-Small-Instruct-2409+rAIfle/Acolyte-LORA
merge_method: task_arithmetic
parameters:
  normalize: false
dtype: bfloat16
tokenizer_source: union
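
A minimal sketch of how a config like this is typically consumed, assuming it is saved locally as mergekit_config.yml; the output path is an illustrative assumption, not part of this repo. The CLI equivalent is `mergekit-yaml mergekit_config.yml ./output-dir`.

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "mergekit_config.yml"    # this file, saved locally (assumed path)
OUT_PATH = "./MS_a-coolyte-2409-22B"  # where to write the merged model (assumed path)

# Parse the YAML into mergekit's config object.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the task_arithmetic merge described above.
run_merge(
    merge_config,
    out_path=OUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is present
        copy_tokenizer=True,             # write a tokenizer alongside the merged weights
    ),
)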