---
license: apache-2.0
datasets:
- MathGenie/MathCode-Pile
language:
- en
metrics:
- accuracy
base_model:
- codellama/CodeLlama-7b-hf
pipeline_tag: text-generation
tags:
- math
---

# MathCoder2

### Introduction

The MathCoder2 models are created by conducting continued pretraining on [MathCode-Pile](https://huggingface.co/datasets/MathGenie/MathCode-Pile). They are introduced in the paper [MathCoder2: Better Math Reasoning from Continued Pretraining on Model-translated Mathematical Code](https://arxiv.org/abs/2410.08196).

The mathematical pretraining dataset pairs model-translated mathematical code with the natural language reasoning steps behind it, making it a strong resource for training models to perform advanced mathematical reasoning.
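The model can be queried with the standard Hugging Face `transformers` text-generation API. The sketch below is an illustrative helper, not an official recipe from this card: the model id `MathGenie/MathCoder2-CodeLlama-7B` is an assumption inferred from the card's metadata (base model `codellama/CodeLlama-7b-hf`), and the dtype/device settings are ordinary defaults you may want to adjust.

```python
def generate_solution(prompt, model_name="MathGenie/MathCoder2-CodeLlama-7B",
                      max_new_tokens=512):
    """Complete a math prompt with a MathCoder2 model via transformers.

    NOTE: the default model_name is an assumed repo id, not confirmed by
    this card; substitute the actual checkpoint you intend to use.
    """
    # Lazy imports so the helper can be defined without torch/transformers
    # installed; loading a 7B checkpoint also needs sufficient GPU memory.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Strip the prompt tokens and return only the generated continuation.
    return tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)
```

Since MathCoder2 is a base (continued-pretrained) model rather than an instruction-tuned chat model, plain completion-style prompts such as a partially worked problem tend to work better than conversational instructions.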

### Evaluation

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65dd9e7b4a4fce1ec96dc6b7/BEZoDZLjp-fPFlt7oFXBa.png)