foamliu committed
Commit 283e6d5 · verified · 1 Parent(s): 60ca924

End of training

Files changed (2):
  1. README.md +3 -1
  2. config.json +1 -1
README.md CHANGED
@@ -1,8 +1,10 @@
 ---
+datasets: open-r1/OpenR1-Math-220k
 library_name: transformers
 model_name: Xmodel2-1.2B-Open-R1-GRPO
 tags:
 - generated_from_trainer
+- open-r1
 - trl
 - grpo
 licence: license
@@ -10,7 +12,7 @@ licence: license
 
 # Model Card for Xmodel2-1.2B-Open-R1-GRPO
 
-This model is a fine-tuned version of [None](https://huggingface.co/None).
+This model is a fine-tuned version of [None](https://huggingface.co/None) on the [open-r1/OpenR1-Math-220k](https://huggingface.co/datasets/open-r1/OpenR1-Math-220k) dataset.
 It has been trained using [TRL](https://github.com/huggingface/trl).
 
 ## Quick start
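The updated card points readers to a Quick start section. A minimal usage sketch in the style TRL-generated cards usually include; the repo id foamliu/Xmodel2-1.2B-Open-R1-GRPO and the need for trust_remote_code are assumptions here, since the base-model link in the card is still [None]:

```python
# Minimal usage sketch. Assumptions: the checkpoint is published as
# "foamliu/Xmodel2-1.2B-Open-R1-GRPO" and the Xmodel2 architecture ships
# custom modeling code, hence trust_remote_code=True.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="foamliu/Xmodel2-1.2B-Open-R1-GRPO",
    trust_remote_code=True,
)

prompt = "Solve: if 3x + 5 = 20, what is x?"
print(generator(prompt, max_new_tokens=256)[0]["generated_text"])
```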
config.json CHANGED
@@ -34,6 +34,6 @@
   "scale_emb": 12,
   "torch_dtype": "bfloat16",
   "transformers_version": "4.49.0",
-  "use_cache": false,
+  "use_cache": true,
   "vocab_size": 65280
 }
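The only config.json change flips use_cache back to true: KV caching is typically disabled during fine-tuning (it conflicts with gradient checkpointing), and re-enabling it at the end of training speeds up autoregressive generation. A short sketch of how the flag surfaces at inference time, under the same hypothetical repo-id assumptions as above:

```python
# Sketch only; the repo id and trust_remote_code are assumptions, as above.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "foamliu/Xmodel2-1.2B-Open-R1-GRPO"
tok = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo, trust_remote_code=True)

print(model.config.use_cache)  # True after this commit

inputs = tok("2 + 2 =", return_tensors="pt")
# With use_cache=True, generate() reuses past key/value states instead of
# recomputing attention over the full prefix at every decoding step.
out = model.generate(**inputs, max_new_tokens=32)
print(tok.decode(out[0], skip_special_tokens=True))
```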