---
license: llama2
---

This is LLaMA-2-7B fine-tuned on the Math task using [CorDA](https://huggingface.co/papers/2406.05223) in IPA (instruction-previewed adaptation) mode with MetaMath.

| Method | TriviaQA | NQ open | GSM8k | Math |
|---|---|---|---|---|
|LoRA|44.17|1.91|42.68|5.92|
|[CorDA (KPA with nqopen)](https://huggingface.co/iboing/CorDA_KPA_nqopen_finetuned_math/tree/main) | **45.23** | **10.44** | 45.64 | 6.94|
|[CorDA (IPA with MetaMath)](https://huggingface.co/iboing/CorDA_IPA_math_finetuned_math/tree/main) | - | - | **54.59** | **8.54** |

You can evaluate the model's performance by following step 3 in the [CorDA GitHub repo](https://github.com/iboing/CorDA).
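For a quick interactive check outside the evaluation pipeline, the checkpoint can be loaded with the standard `transformers` API (after merging the adapter; see the note below). The instruction template in `build_prompt` is an assumption based on the common MetaMath/Alpaca format; consult the evaluation code in the CorDA repo for the exact template used in step 3.

```python
MODEL_ID = "iboing/CorDA_IPA_math_finetuned_math"  # this repo


def build_prompt(question: str) -> str:
    # Alpaca/MetaMath-style instruction template (an assumption; check the
    # CorDA repo's evaluation code for the exact format it uses).
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{question}\n\n### Response:"
    )


def generate_answer(question: str, max_new_tokens: int = 256) -> str:
    # Imports kept local so build_prompt works without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )
```

Greedy decoding (`do_sample=False`) is the usual choice for math benchmarks such as GSM8k, where a single deterministic answer is scored.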

Note: the model trained with the CorDA adapter relies on customized code. If you want to restore the original LLaMA architecture, run `merge_adapter_for_corda.py` from the [CorDA GitHub repo](https://github.com/iboing/CorDA).