add LGAI-EXAONE/EXAONE-3.5-32B-Instruct as another baseline?
#3 opened by yhg0112
Huge kudos to the OLMo team for making OLMo-2-32B-Instruct fully open source.
Is there any plan to add "LGAI-EXAONE/EXAONE-3.5-32B-Instruct" as another baseline?
We pre-trained "LGAI-EXAONE/EXAONE-3.5-32B-Instruct" on 6.5T tokens, which makes it a fair baseline for comparison with OLMo-2-32B-Instruct.
Here's our open-weight model:
https://huggingface.co/LGAI-EXAONE/EXAONE-3.5-32B-Instruct