---
base_model: []
library_name: transformers
tags:
  - mergekit
  - merge
---

# 14b-merge-test-4

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

## Merge Details

### Merge Method

This model was merged using the della_linear merge method, with /home/ciaran/Documents/Quantisingmodels/EVA-Qwen2.5-14B-v0.2/ as the base model.
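
For intuition: della_linear works on parameter deltas (each fine-tuned model's weights minus the base weights), stochastically drops low-magnitude delta entries, rescales the survivors, and adds a weighted linear combination of the results back onto the base; `epsilon` controls the spread of the drop probabilities around `1 - density`, and `lambda` scales the merged delta. The snippet below is only a rough illustrative sketch of that idea (the function names and the per-tensor ranking scheme are assumptions made for the example), not mergekit's actual implementation.

```python
# Illustrative sketch of the della_linear idea -- NOT mergekit's implementation.
import torch

def drop_and_rescale(delta: torch.Tensor, density: float, epsilon: float) -> torch.Tensor:
    """Keep high-magnitude delta entries with higher probability, rescale survivors."""
    flat = delta.flatten().float()
    n = flat.numel()
    # Rank entries by magnitude: rank 0 = smallest, n-1 = largest.
    ranks = torch.empty(n)
    ranks[flat.abs().argsort()] = torch.arange(n, dtype=torch.float32)
    # Keep-probabilities spread over [density - eps/2, density + eps/2] by rank.
    keep_p = (density - epsilon / 2) + epsilon * ranks / max(n - 1, 1)
    keep_p = keep_p.clamp(min=1e-6, max=1.0)
    mask = torch.bernoulli(keep_p)
    # Dividing survivors by keep_p keeps the expected value of each entry unchanged.
    return (flat * mask / keep_p).reshape(delta.shape)

def della_linear(base, tuned, densities, weights, epsilon=0.05, lam=1.0):
    """Weighted linear combination of dropped-and-rescaled deltas added to the base."""
    # Note: any weight normalisation mergekit may apply is omitted for simplicity.
    merged_delta = torch.zeros_like(base, dtype=torch.float32)
    for t, d, w in zip(tuned, densities, weights):
        merged_delta += w * drop_and_rescale(t.float() - base.float(), d, epsilon)
    return (base.float() + lam * merged_delta).to(base.dtype)
```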

### Models Merged

The following models were included in the merge:

* /home/ciaran/Documents/Quantisingmodels/Tissint-14B-128k-RP/

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
    - model: /home/ciaran/Documents/Quantisingmodels/Tissint-14B-128k-RP/
      parameters:
        density: 0.4
        weight: 0.3
    - model: /home/ciaran/Documents/Quantisingmodels/EVA-Qwen2.5-14B-v0.2/
      parameters:
        density: 0.6
        weight: 0.7

merge_method: della_linear
base_model: /home/ciaran/Documents/Quantisingmodels/EVA-Qwen2.5-14B-v0.2/
parameters:
    epsilon: 0.05
    lambda: 1
dtype: bfloat16
```
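
To reproduce a merge like this, the YAML above can be passed to mergekit. Below is a minimal sketch using mergekit's documented Python entry points; it assumes the configuration is saved as `della-config.yaml`, that the local model paths in the YAML exist, and that option names match your installed mergekit version (the `mergekit-yaml` CLI is an equivalent alternative). The last lines load the merged output with transformers.

```python
# Sketch only: assumes mergekit is installed and the paths in the YAML exist.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge
from transformers import AutoModelForCausalLM, AutoTokenizer

CONFIG_YML = "della-config.yaml"   # the YAML shown above (assumed filename)
OUTPUT_PATH = "./EVA-Tissint-14B"  # where to write the merged model

# Parse and validate the merge configuration.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge and write the result to OUTPUT_PATH.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the base model's tokenizer
    ),
)

# Load the merged result for inference.
tokenizer = AutoTokenizer.from_pretrained(OUTPUT_PATH)
model = AutoModelForCausalLM.from_pretrained(OUTPUT_PATH, torch_dtype=torch.bfloat16)
```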