adding MMLU results
results.csv CHANGED (+3 -2)
@@ -1,3 +1,4 @@
+<<<<<<< HEAD
 Model (CoT),TheoremQA,MATH,GSM,GPQA,MMLU-STEM
 [Mistral-7B-v0.2-base](https://huggingface.co/TIGER-Lab/Mistral-7B-Base-V0.2),19.2,10.2,36.2,24.7,50.1
 [Mixtral-7x8B-base](https://huggingface.co/mistralai/Mixtral-8x7B-v0.1),23.2,22.1,58.4,27.2,59.7
@@ -23,5 +24,5 @@ GPT-4-turbo-0409,52.4,69.2,94.5,46.2,76.5
 [WizardMath-7B-1.1](https://huggingface.co/WizardLM/WizardMath-7B-V1.1),11.7,33,83.2,28.7,52.7
 [MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B),16.5,28.2,77.7,30.8,51.3
 [Abel-7B-002](https://huggingface.co/GAIR/Abel-7B-002),19.3,29.5,83.2,30.3,29.7
-[OpenMath-Mistral-7B](https://huggingface.co/
-[Rho-1-Math-7B](https://huggingface.co/microsoft/rho-math-7b-v0.1),21.0,31.0,66.9,29.2,53.1
+[OpenMath-Mistral-7B](https://huggingface.co/nvidia/OpenMath-Mistral-7B-v0.1),13.1,9.1,24.5,26.5,43.7
+[Rho-1-Math-7B](https://huggingface.co/microsoft/rho-math-7b-v0.1),21.0,31.0,66.9,29.2,53.1
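For anyone consuming the updated table programmatically, the sketch below shows one way to load results.csv and rank models by the newly added MMLU-STEM column. It is an illustrative example, not part of the dataset: the pandas-based loading, the `strip_markdown_link` helper, and the assumption that results.csv sits in the working directory are all my own choices. It also filters out the stray `<<<<<<< HEAD` conflict marker that this commit adds as line 1 of the file, so the CSV header parses correctly.

```python
import io
import re

import pandas as pd

# Path assumes results.csv is in the current working directory; adjust if
# you cloned the dataset repository elsewhere.
with open("results.csv", encoding="utf-8") as f:
    # Drop any leftover merge-conflict marker lines (the diff above adds
    # a "<<<<<<< HEAD" line at the top of the file).
    lines = [ln for ln in f if not ln.startswith(("<<<<<<<", "=======", ">>>>>>>"))]

df = pd.read_csv(io.StringIO("".join(lines)))

def strip_markdown_link(cell: str) -> str:
    # The "Model (CoT)" column mixes plain names with Markdown links like
    # [Name](https://huggingface.co/org/repo); keep only the display name.
    match = re.match(r"\[([^\]]+)\]", str(cell))
    return match.group(1) if match else str(cell)

df["Model (CoT)"] = df["Model (CoT)"].apply(strip_markdown_link)

# Coerce the benchmark columns to numbers (malformed cells become NaN).
score_cols = [c for c in df.columns if c != "Model (CoT)"]
df[score_cols] = df[score_cols].apply(pd.to_numeric, errors="coerce")

# Example: rank models by the newly added MMLU-STEM scores.
print(df.sort_values("MMLU-STEM", ascending=False)[["Model (CoT)", "MMLU-STEM"]])
```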