Update README.md
# Aura-llama



Now that the cute anime girl has your attention.

Aura-llama uses depth up-scaling (DUS), the method presented in the SOLAR paper for scaling LLMs, which pairs an architectural modification (restacking duplicated layer ranges into a deeper network) with continued pretraining.
Using the SOLAR paper as a base, I integrated Llama-3 weights into the upscaled layers; I plan to continue training the model in the future.
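For a concrete sense of the layer arithmetic, here is a rough sketch (an illustration, not mergekit internals). It assumes Llama-3-8B-Instruct's 32 transformer layers, that mergekit `layer_range` bounds are half-open, and that the lower slice covers something like `[0, 24]`; only the `[7, 31]` slice is confirmed in the configuration below.

```python
# Sketch of the DUS layer arithmetic, not mergekit internals.
# Assumption: the lower slice is [0, 24]; only [7, 31] is confirmed
# by the configuration below.
lower = list(range(0, 24))   # hypothetical lower slice: layers 0-23
upper = list(range(7, 31))   # the [7, 31] slice: layers 7-30

# A passthrough merge concatenates the slices depth-wise.
stacked = lower + upper
print(len(lower), len(upper), len(stacked))  # 24 24 48
```

Overlapping the two slices is the SOLAR recipe: the duplicated middle layers give the deeper network room to absorb new capacity during continued pretraining.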

Aura-llama is a merge of the following models, used to create a base model to work from:

* [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)
* [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct)

## Merged Evals (Has Not Been Finetuned)

Aura-llama:
* Avg: ?
* ARC: ?
* HellaSwag: ?
* MMLU: ?
* T-QA: ?
* Winogrande: ?
* GSM8K: ?
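The `?` placeholders above are waiting on benchmark runs. As a hedged sketch of how they could be produced, this uses EleutherAI's lm-evaluation-harness (assumes harness v0.4+, `pip install lm-eval`) with a placeholder repo id; neither detail is confirmed by this card.

```python
# Hedged sketch: filling in the eval table with EleutherAI's
# lm-evaluation-harness (assumes v0.4+; `pip install lm-eval`).
# "your-namespace/Aura-llama" is a placeholder repo id.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=your-namespace/Aura-llama,dtype=bfloat16",
    tasks=["arc_challenge", "hellaswag", "mmlu",
           "truthfulqa_mc2", "winogrande", "gsm8k"],  # T-QA -> truthfulqa_mc2
)
for task, metrics in results["results"].items():
    print(task, metrics)
```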

## 🧩 Configuration

```
slices:
- sources:
  - model: meta-llama/Meta-Llama-3-8B-Instruct
    layer_range: [...]
- sources:
  - model: meta-llama/Meta-Llama-3-8B-Instruct
    layer_range: [7, 31]
merge_method: passthrough
dtype: bfloat16
```
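As a minimal usage sketch once the weights are up (again, the repo id below is a placeholder assumption, not confirmed by this card): the merged base loads like any 🤗 Transformers causal LM, and `num_hidden_layers` is a quick check on the up-scaled depth.

```python
# Minimal loading sketch with Hugging Face Transformers.
# "your-namespace/Aura-llama" is a placeholder repo id; substitute the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "your-namespace/Aura-llama"  # placeholder, not confirmed by this card
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

print(model.config.num_hidden_layers)  # depth of the up-scaled stack

prompt = "Briefly explain depth up-scaling."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```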