---
license: apache-2.0
datasets:
  - Leon-Leee/wizardlm_evol_instruct_v2_196K_backuped
  - m-a-p/Code-Feedback
  - openbmb/UltraInteract_sft
  - ise-uiuc/Magicoder-Evol-Instruct-110K
language:
  - en
metrics:
  - code_eval
library_name: transformers
tags:
  - code
---

# AIGCodeGeek-DS-6.7B

## Introduction

AIGCodeGeek-DS-6.7B is the first release in our Code-LLM family, with competitive performance on benchmarks such as HumanEval(+) and MBPP(+).

It draws many insights from the open-source community, and we deeply appreciate all of this great work.

We are preparing the tech report, so stay tuned for more details. :)

## Model Details

### Model Description

### Training data

A mixture of both:

- samples from several high-quality open-source datasets (see Acknowledgements), and
- our private datasets (decontaminated against the evaluation benchmarks).

## Evaluation

See our evaluation results on EvalPlus.

## Requirements

It should work with the same requirements as DeepSeek-Coder-6.7B, or with the following packages:

```
tokenizers>=0.14.0
transformers>=4.35.0
accelerate
sympy>=1.12
pebble
timeout-decorator
attrdict
```
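For convenience, the pinned versions above can be installed in one step in a standard pip-based environment (this is a sketch for setup only; the package names and version bounds are copied verbatim from the list):

```shell
pip install "tokenizers>=0.14.0" "transformers>=4.35.0" accelerate \
    "sympy>=1.12" pebble timeout-decorator attrdict
```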

## QuickStart

TBD
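Until the official quickstart is published, here is a minimal sketch of prompting the model with 🤗 Transformers. Two assumptions not confirmed by this card: the Hub repo id `Leon-Leee/AIGCodeGeek-DS-6.7B`, and that the tokenizer ships a DeepSeek-Coder-style chat template (the model is built on DeepSeek-Coder-6.7B).

```python
# Minimal usage sketch (not an official example).
# Assumptions: the Hub repo id below, and a DeepSeek-Coder-style chat template.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "Leon-Leee/AIGCodeGeek-DS-6.7B"  # assumed repo id


def build_messages(instruction: str) -> list:
    """Wrap a single user instruction in the chat-message format
    expected by tokenizer.apply_chat_template()."""
    return [{"role": "user", "content": instruction}]


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto"
    )

    messages = build_messages(
        "Write a Python function that checks whether a number is prime."
    )
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=256, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    print(tokenizer.decode(outputs[0][inputs.shape[1]:], skip_special_tokens=True))
```

Greedy decoding (`do_sample=False`) is used here for reproducible code output; adjust the sampling parameters to taste.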


## Limits

## Acknowledgements