---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
---
# EVA-Tissint-v1.2-14B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

A new merge of [EVA v0.2](https://huggingface.co/EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2), now using [Tissint v1.2](https://huggingface.co/Ttimofeyka/Tissint-14B-v1.2-128k-RP). I slightly altered the merge parameters yet again; I'm curious how this will work for other people. I recommend the samplers provided on Tissint's model card.

### Merge Method

This model was merged using the della_linear merge method, with EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2 as the base.

### Models Merged

The following models were included in the merge:
* Ttimofeyka/Tissint-14B-v1.2-128k-RP

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Ttimofeyka/Tissint-14B-v1.2-128k-RP
    parameters:
      density: 0.45
      weight: 0.35
  - model: EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2
    parameters:
      density: 0.55
      weight: 0.65
merge_method: della_linear
base_model: EVA-UNIT-01/EVA-Qwen2.5-14B-v0.2
parameters:
  epsilon: 0.05
  lambda: 1
dtype: bfloat16
```
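For intuition, here is a toy NumPy sketch of the della_linear idea: each fine-tuned model contributes a task vector (its delta from the base), a fraction of that delta's entries is dropped and the survivors are rescaled, and the pruned deltas are combined linearly with the per-model `weight` values before being added back to the base. This is only an illustration, not mergekit's actual implementation — real DELLA uses magnitude-adaptive drop probabilities, while this sketch drops entries uniformly at random, and the function name is invented for the example.

```python
import numpy as np

def della_linear_sketch(base, finetuned_list, densities, weights, rng):
    """Toy approximation of a della_linear merge on a single tensor.

    NOTE: illustrative only. Real DELLA assigns drop probabilities based on
    parameter magnitude; here we drop uniformly at random for simplicity.
    """
    merged = base.astype(np.float64).copy()
    for ft, density, weight in zip(finetuned_list, densities, weights):
        delta = ft - base                      # task vector vs. the base model
        keep = rng.random(delta.shape) < density  # keep ~`density` of entries
        delta = np.where(keep, delta / density, 0.0)  # rescale survivors
        merged += weight * delta               # linear combination of deltas
    return merged

# Example: with density=1 and weight=1, a single-model "merge" recovers
# the fine-tuned weights exactly.
base = np.zeros(4)
ft = np.arange(4.0)
out = della_linear_sketch(base, [ft], [1.0], [1.0], np.random.default_rng(0))
```

With the densities and weights from the config above (0.45/0.35 for Tissint, 0.55/0.65 for EVA), the merged tensor stays close to EVA while pulling in a sparsified slice of Tissint's delta.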