---
license: apache-2.0
datasets:
- open-r1/OpenR1-Math-220k
language:
- en
base_model:
- allenai/OLMo-2-0325-32B-Instruct
library_name: transformers
tags:
- reasoning
- math
pipeline_tag: text-generation
---

TNG Technology Consulting fine-tuned the 32-billion-parameter OLMo-2 instruct model on AMD MI300X GPUs using the OpenR1-Math-220k dataset, with the goal of strengthening the model's mathematical reasoning. The MI300X's multi-chip-module architecture and 192 GB of HBM3 memory per accelerator made it practical to train a model of this size efficiently. OpenR1-Math-220k, curated by Hugging Face's Open R1 project, pairs mathematical problems with detailed reasoning traces, which makes it well suited to this kind of fine-tuning. The result illustrates what open models, open data, and capable hardware can achieve together in AI research.
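
## Usage

A minimal inference sketch with the Transformers library, assuming the checkpoint is a standard causal LM with a chat template (the base OLMo-2 instruct model ships one). The repo id below is a placeholder, not the actual published name — substitute the id of this repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with the actual id of this model repository.
model_id = "tngtech/olmo2-32b-openr1-math"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # ~64 GB of weights at 32B params in bf16
    device_map="auto",           # shard across available GPUs if one is not enough
)

messages = [
    {"role": "user", "content": "What is the sum of the first 100 positive integers?"}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=512)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For reasoning-style outputs, a larger `max_new_tokens` budget is typically needed, since the model is trained to emit a full chain of reasoning before its final answer.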