---
base_model:
  - Qwen/Qwen2.5-14B
  - sometimesanotion/Qwenvergence-14B-v6-Prose-slerp
library_name: transformers
tags:
  - mergekit
  - merge
license: apache-2.0
language:
  - en
metrics:
  - accuracy
pipeline_tag: text-generation
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
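To reproduce a merge like this one, mergekit's `mergekit-yaml` command takes a YAML configuration (such as the one listed under Configuration below) and an output directory. The snippet below is a minimal sketch, not the exact command used for this model: the configuration filename and output path are placeholders, and it assumes mergekit is installed (`pip install mergekit`).

```python
# Sketch: invoking mergekit's CLI from Python to re-run a merge.
# "qwenvergence-v6-prose.yaml" is a placeholder for a local copy of the
# configuration shown in this card; the output path is likewise arbitrary.
import subprocess

subprocess.run(
    [
        "mergekit-yaml",
        "qwenvergence-v6-prose.yaml",   # merge configuration file
        "./Qwenvergence-14B-v6-Prose",  # output directory for the merged weights
        "--cuda",                       # optional: run tensor math on GPU
    ],
    check=True,
)
```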

## Merge Details

### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [Qwen/Qwen2.5-14B](https://huggingface.co/Qwen/Qwen2.5-14B) as the base.
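Roughly, TIES-merging takes each fine-tuned model's delta from the base, trims low-magnitude entries, elects a per-parameter sign by majority, and averages only the deltas that agree with that sign. The single-tensor sketch below is illustrative only; it is not mergekit's implementation, which also handles per-model densities and weights, the `int8_mask` option used in the configuration here, and whole checkpoints rather than single tensors.

```python
# Minimal single-tensor sketch of the TIES idea: trim, elect sign, disjoint merge.
# With density=1.0, as in this card's configuration, nothing is trimmed.
import torch


def ties_merge(base: torch.Tensor, finetuned: list[torch.Tensor], density: float = 1.0) -> torch.Tensor:
    deltas = [ft - base for ft in finetuned]

    # Trim: keep only the top-`density` fraction of each delta by magnitude.
    trimmed = []
    for d in deltas:
        k = max(1, int(density * d.numel()))
        threshold = d.abs().flatten().kthvalue(d.numel() - k + 1).values
        trimmed.append(torch.where(d.abs() >= threshold, d, torch.zeros_like(d)))

    # Elect sign: majority sign of the summed trimmed deltas, per parameter.
    stacked = torch.stack(trimmed)
    sign = torch.sign(stacked.sum(dim=0))

    # Disjoint merge: average only the deltas that agree with the elected sign.
    agree = (torch.sign(stacked) == sign) & (stacked != 0)
    merged_delta = (stacked * agree).sum(dim=0) / agree.sum(dim=0).clamp(min=1)

    return base + merged_delta
```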

### Models Merged

The following models were included in the merge:

* [sometimesanotion/Qwenvergence-14B-v6-Prose-slerp](https://huggingface.co/sometimesanotion/Qwenvergence-14B-v6-Prose-slerp)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
name:                Qwenvergence-14B-v6-Prose-model_stock
merge_method:        model_stock
base_model:          Qwen/Qwen2.5-14B
tokenizer_source:    huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2
parameters:
  int8_mask:         true
  normalize:         true
  rescale:           false
models:
  - model:           arcee-ai/Virtuoso-Small
  - model:           sometimesanotion/Lamarck-14B-v0.3
  - model:           EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2
  - model:           allura-org/TQ2.5-14B-Sugarquill-v1
  - model:           oxyapi/oxy-1-small
  - model:           v000000/Qwen2.5-Lumen-14B
  - model:           sthenno-com/miscii-14b-1225
  - model:           sthenno-com/miscii-14b-1225
  - model:           underwoods/medius-erebus-magnum-14b
  - model:           huihui-ai/Qwen2.5-14B-Instruct-abliterated-v2
dtype:               float32
out_dtype:           bfloat16
---
# Nifty TIES, LoRA, SLERP involving the listed models
---
name:                Qwenvergence-14B-v6-Prose
merge_method:        ties
base_model:          Qwen/Qwen2.5-14B
tokenizer_source:    base
parameters:         
  density:           1.00
  weight:            1.00
  int8_mask:         true
  normalize:         true
  rescale:           false
dtype:               float32
out_dtype:           bfloat16
models:
  - model:           sometimesanotion/Qwenvergence-14B-v6-Prose-slerp
    parameters:
      density:       1.00
      weight:        1.00
```
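Once downloaded, the merged model loads like any other Qwen2.5-based causal LM. The snippet below is a minimal sketch using the Transformers `AutoTokenizer`/`AutoModelForCausalLM` API; the repository id is an assumption, so substitute the repo you actually want to load, and note that `device_map="auto"` requires `accelerate`.

```python
# Minimal sketch: loading the merged model with Transformers.
# The repo id below is an assumption; replace it with the repository you are loading.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "sometimesanotion/Qwenvergence-14B-v6-Prose-model_stock"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the configurations above
    device_map="auto",
)

prompt = "Write a short scene set in a lighthouse during a storm."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```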