Update README.md
README.md
CHANGED
@@ -26,6 +26,14 @@ The following models were included in the merge:
* mlabonne/NeuralBeagle14-7B
* HuggingFaceH4/zephyr-7b-alpha

+### Benchmarks
+
+#### Open LLM Leaderboard
+| Model | Average | ARC | HellaSwag | MMLU | TruthfulQA | Winogrande | GSM8K |
+| ------------------------------ | ------- | ---- | --------- | ----- | ---------- | ---------- | ----- |
+| mayacinka/NeuralZephyr-Beagle-7B | 71.57 | 68.6 | 86.38 | 64.67 | 65.17 | 81.14 | 63.46 |
+
+
### Configuration

The following YAML configuration was used to produce this model:
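For reference, a minimal usage sketch for the benchmarked model, assuming the merged checkpoint is published on the Hub under the repository name shown in the table above and loads as a standard causal LM via `transformers`; the prompt and generation settings are illustrative, not taken from this commit:

```python
# Minimal usage sketch (assumption: the merged checkpoint loads as a standard
# causal LM; generation settings below are illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mayacinka/NeuralZephyr-Beagle-7B"  # repository name from the benchmark table
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # requires `accelerate`
)

prompt = "Explain what a model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```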