---
base_model:
- Sao10K/70B-L3.3-Cirrus-x1
- TheDrummer/Anubis-70B-v1
- Sao10K/L3.3-70B-Euryale-v2.3
- SicariusSicariiStuff/Negative_LLAMA_70B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [Linear DELLA](https://arxiv.org/abs/2406.11617) merge method, with [Sao10K/L3.3-70B-Euryale-v2.3](https://huggingface.co/Sao10K/L3.3-70B-Euryale-v2.3) as the base.

### Models Merged

The following models were included in the merge:

* [Sao10K/70B-L3.3-Cirrus-x1](https://huggingface.co/Sao10K/70B-L3.3-Cirrus-x1)
* [TheDrummer/Anubis-70B-v1](https://huggingface.co/TheDrummer/Anubis-70B-v1)
* [SicariusSicariiStuff/Negative_LLAMA_70B](https://huggingface.co/SicariusSicariiStuff/Negative_LLAMA_70B)

### Configuration

The following YAML configuration was used to produce this model. Each contributing model is merged with equal weight and a delta density of 0.55, meaning roughly 55% of each model's delta from the base is retained by DELLA's magnitude-based sampling before the linear combination:

```yaml
merge_method: della_linear
dtype: bfloat16
parameters:
  normalize: true
  int8_mask: true
tokenizer_source: base
base_model: Sao10K/L3.3-70B-Euryale-v2.3
models:
  - model: Sao10K/70B-L3.3-Cirrus-x1
    parameters:
      density: 0.55
      weight: 1
  - model: SicariusSicariiStuff/Negative_LLAMA_70B
    parameters:
      density: 0.55
      weight: 1
  - model: TheDrummer/Anubis-70B-v1
    parameters:
      density: 0.55
      weight: 1
```
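### Usage

A minimal sketch of reproducing the merge and loading the result, assuming mergekit and `transformers` (with `accelerate` for multi-GPU sharding) are installed; `config.yaml`, the `./merged` output path, and the prompt are placeholders, not values from this card:

```python
# Reproduce the merge from a shell (placeholder paths):
#   pip install mergekit
#   mergekit-yaml config.yaml ./merged --cuda
# where config.yaml contains the YAML configuration shown above.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = "./merged"  # placeholder: local merge output or a Hub repo id

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # shard the 70B weights across available GPUs
)

prompt = "Write a short scene set on a windswept mountain ridge."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that a 70B model in bfloat16 needs roughly 140 GB of memory across devices; quantized loading is a common alternative on smaller setups.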