---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- meta-llama/Llama-2-7b-hf
- epfl-llm/meditron-7b
---
# without_bio_apart_from_Llama-2-7B-hf

without_bio_apart_from_Llama-2-7B-hf is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf)
* [epfl-llm/meditron-7b](https://huggingface.co/epfl-llm/meditron-7b)

## 🧩 Configuration
```yaml
models:
  - model: meta-llama/Llama-2-7b-hf
    parameters:
      weight: 0.5
  - model: epfl-llm/meditron-7b
    parameters:
      weight: 0.5
merge_method: task_arithmetic
base_model: meta-llama/Llama-2-7b-hf
parameters:
  normalize: true
  int8_mask: true
dtype: bfloat16
```
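
## 💻 Usage

The merged checkpoint can be loaded like any other Llama-2-style causal LM. Below is a minimal sketch using 🤗 Transformers; the `model_id` is a placeholder and assumes the merge output has been pushed to a Hugging Face repository.

```python
# Minimal usage sketch (assumed repo id; replace with the actual upload location).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/without_bio_apart_from_Llama-2-7B-hf"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used in the merge config
    device_map="auto",
)

prompt = "Explain the difference between bacterial and viral infections."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```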