reaperdoesntknow committed
Commit 51c60f4 · verified · 1 Parent(s): 5ac2bdb

Update README.md

Files changed (1)
  1. README.md +10 -8
README.md CHANGED
@@ -28,8 +28,8 @@ pipeline_tag: text-generation

# Model Card for SmolLM2_Thinks

- This model is a fine-tuned version of [None](https://huggingface.co/prithivMLmods/SmolLM2-CoT-360M).
- It has been trained using [TRL](https://github.com/huggingface/trl).
+ This model is a fine-tuned version of [prithivMLmods/SmolLM2-CoT-360M](https://huggingface.co/prithivMLmods/SmolLM2-CoT-360M).
+ It has been trained using multiple rounds of [TRL](https://github.com/huggingface/trl).

## Quick start

@@ -37,17 +37,16 @@ It has been trained using [TRL](https://github.com/huggingface/trl).
from transformers import pipeline

question = "If you had a time machine, but could only go to the past or the future once and never return, which would you choose and why?"
- generator = pipeline("text-generation", model="reaperdoesntknow/SmolLM2_Thinks", device="cuda")
+ generator = pipeline("text-generation", model="reaperdoesntknow/SMOLM2Prover", device="cuda")
output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
print(output["generated_text"])
- ```
-
- ## Training procedure

-
+ from transformers import AutoTokenizer, AutoModelForCausalLM

+ tokenizer = AutoTokenizer.from_pretrained("reaperdoesntknow/SMOLM2Prover")
+ model = AutoModelForCausalLM.from_pretrained("reaperdoesntknow/SMOLM2Prover")
+ ```

- This model was trained with SFT.

### Framework versions

@@ -57,6 +56,9 @@ This model was trained with SFT.
- Datasets: 4.0.0
- Tokenizers: 0.22.0

+ ## Acknowledgements
+ - I acknowledge you!
+
## Citations

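The updated quick start loads the checkpoint with AutoTokenizer / AutoModelForCausalLM but stops before generating anything. A minimal continuation might look like the sketch below; the chat-template call, prompt, and generation settings are assumptions for illustration, not part of the commit.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Model id as added in this commit; everything past loading is illustrative.
tokenizer = AutoTokenizer.from_pretrained("reaperdoesntknow/SMOLM2Prover")
model = AutoModelForCausalLM.from_pretrained("reaperdoesntknow/SMOLM2Prover")

# Assumes the tokenizer ships a chat template, as SmolLM2-family checkpoints usually do.
messages = [{"role": "user", "content": "If you had a time machine, would you visit the past or the future?"}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```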
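The model card states the model was fine-tuned with TRL, and the line removed in this commit named SFT specifically. A single SFT round with TRL's SFTTrainer could look roughly like the sketch below; the dataset, hyperparameters, and output directory are placeholders, not details recorded in this repository.

```python
from datasets import load_dataset
from trl import SFTConfig, SFTTrainer

# Placeholder chat-style dataset; the actual training data is not named in the commit.
dataset = load_dataset("HuggingFaceTB/smoltalk", "all", split="train")

training_args = SFTConfig(
    output_dir="SmolLM2_Thinks-sft",   # hypothetical output path
    per_device_train_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=2e-5,
    num_train_epochs=1,
)

trainer = SFTTrainer(
    model="prithivMLmods/SmolLM2-CoT-360M",  # base checkpoint named in the README
    args=training_args,
    train_dataset=dataset,
)
trainer.train()
```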