innovation-hacking2 committed
Commit bfa2138 · verified · 1 Parent(s): 892cb5b

Create README.md


We fine-tuned the OLMo 2 32B large language model on the open-r1/OpenR1-Math-220k dataset using AMD Instinct MI300X GPUs.

Files changed (1)
  1. README.md +13 -0
README.md ADDED
@@ -0,0 +1,13 @@
+ ---
+ license: apache-2.0
+ datasets:
+ - open-r1/OpenR1-Math-220k
+ language:
+ - en
+ base_model:
+ - allenai/OLMo-2-0325-32B-Instruct
+ library_name: adapter-transformers
+ tags:
+ - reasoning
+ - math
+ ---
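
The front matter above declares allenai/OLMo-2-0325-32B-Instruct as the base model. Below is a minimal, untested sketch of loading and querying the fine-tuned weights with Hugging Face transformers; the repository id placeholder and the assumption that the weights load as a full causal LM (rather than a PEFT adapter) are not confirmed by this commit.

```python
# Minimal sketch, not the confirmed usage for this repo.
# Assumptions: the fine-tuned weights are published under the hypothetical repo id
# below and load as a full model; if they were released as a PEFT/LoRA adapter,
# load the base model first and wrap it with peft.PeftModel.from_pretrained instead.
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "allenai/OLMo-2-0325-32B-Instruct"  # base model from the front matter
repo_id = "innovation-hacking2/<this-repo>"   # hypothetical placeholder, replace with the actual repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native precision
    device_map="auto",    # shard the 32B model across available GPUs
)

prompt = "Solve: what is the sum of the first 100 positive integers?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```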