Update README.md
README.md CHANGED

```diff
@@ -60,9 +60,9 @@ Please note that the above evaluations were run with the newer version of lighte
 
 | | Winogrande (5-shot) | Belebele (5-shot) | HellaSwag (10-shot) | ARC-Challenge (25-shot) | TruthfulQA MC2 (0-shot) | MMLU (5-shot) | Average |
 |----------------|----------------|-------------|--------------|------------------|-------------------|---------|---------|
-| Meltemi 7B v1.5 | 73.4% | 77.7% | 79.6% | 54.1% |
-| Llama-3.1-8B | 74.6
-| Llama-Krikri-8B | 72.6% | 79.8
+| Meltemi 7B v1.5 | 73.4% | 77.7% | 79.6% | 54.1% | 40.5% | 56.9% | 63.7% |
+| Llama-3.1-8B | **74.6%** | 71.5% | **82.0%** | **58.5%** | 44.2% | **66.2%** | 66.2% |
+| Llama-Krikri-8B | 72.6% | **79.8%** | 80.7% | 57.8% | **44.8%** | 65.1% | **67.0%** |
 
 # Ethical Considerations
```
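The Average column in the updated rows can be sanity-checked as the plain mean of the six per-task scores. A minimal sketch, using the values transcribed from the table above; note the published per-task scores are rounded to one decimal, so the recomputed mean can differ slightly from the reported Average (e.g. Llama-Krikri-8B):

```python
# Recompute each model's Average as the mean of its six benchmark scores
# (Winogrande, Belebele, HellaSwag, ARC-Challenge, TruthfulQA MC2, MMLU).
scores = {
    "Meltemi 7B v1.5": [73.4, 77.7, 79.6, 54.1, 40.5, 56.9],
    "Llama-3.1-8B":    [74.6, 71.5, 82.0, 58.5, 44.2, 66.2],
    "Llama-Krikri-8B": [72.6, 79.8, 80.7, 57.8, 44.8, 65.1],
}

for model, s in scores.items():
    # Mean over the six tasks, rounded to one decimal like the table.
    print(f"{model}: {sum(s) / len(s):.1f}%")
```

For Meltemi 7B v1.5 and Llama-3.1-8B this reproduces the reported 63.7% and 66.2%; for Llama-Krikri-8B the rounded inputs give 66.8% against the reported 67.0%, consistent with the average having been computed from unrounded scores.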