---
pipeline_tag: text-generation
license: apache-2.0
tags:
- text generation
- Deci AI
- DeciCoder
programming_language:
- Java
- JavaScript
- Python
- Rust
- Ruby
- C++
- C
- C#
metrics:
- code_eval
inference: true
widget:
- text: 'def print_hello_world():'
example_title: Hello world
group: Python
model-index:
- name: DeciCoder-6b
results:
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Python)
metrics:
- name: pass@1
type: pass@1
value: 0.34
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (JavaScript)
metrics:
- name: pass@1
type: pass@1
value: 0.29
verified: false
- task:
type: text-generation
dataset:
type: nuprl/MultiPL-E
name: MultiPL-HumanEval (Java)
metrics:
- name: pass@1
type: pass@1
value: 0.30
verified: false
datasets:
- bigcode/starcoderdata
---
# Model Card for DeciCoder-6B
DeciCoder-6B is a 6 billion parameter decoder-only code completion model
trained on the Python, Java, JavaScript, Ruby, Rust, C++, C, and C# subsets of the [StarCoder Training Dataset](https://huggingface.co/datasets/bigcode/starcoderdata).
The model uses variable Grouped Query Attention and has a context window of 4096
tokens. It was trained using a Fill-in-the-Middle training objective. The model's
architecture was generated by Deci's proprietary Neural Architecture
Search-based technology, AutoNAC.
## Model Details
- **Developed by:** Deci
- **Model type:** DeciCoder-6B is an auto-regressive language model based on the transformer decoder architecture, using variable Grouped Query Attention.
- **Language(s):** Python, Java, JavaScript, Ruby, Rust, C++, C, C#
- **License:** Model checkpoints are licensed under the [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0)
## Documentation
- Google Colab [Notebook](https://colab.research.google.com/drive/1ZxG9qMlom9vn4lSGlD8PrjwHBvag94ei?usp=sharing)
- Blog Post: [Introducing DeciCoder-6B: The Best Multi-Language Code Generation LLM in Its Class](https://deci.ai/blog/decicoder-6b-the-best-multi-language-code-generation-llm-in-its-class/)
- Tutorial: [How to Run DeciCoder-6B on Qualcomm AI 100](https://github.com/quic/cloud-ai-sdk/tree/1.12/models/language_processing/decoder)
- Questions: Feel free to contact us via our [Discord Community](https://discord.com/invite/p9ecgRhDR8/)!
## Model Architecture
| Parameters | Layers | Heads | Sequence Length | GQA num_key_value_heads |
|:----------|:----------|:----------|:----------|:----------|
| 6B | 32 | 32 | 4096 | Variable |
- **Decoder layer:** Variable Grouped Query Attention
- **Position Embeddings:** Rotary Position Embeddings [Su et al., 2021](https://arxiv.org/abs/2104.09864)
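As a quick illustration of how rotary embeddings encode position, here is a minimal, self-contained sketch (illustrative only; the model's actual implementation ships with the checkpoint's remote code):

```python
import torch

def rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Rotate consecutive feature pairs of x by position-dependent angles.

    x: (seq_len, dim) with dim even. Illustrative sketch, not the model's code.
    """
    seq_len, dim = x.shape
    pos = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)          # (seq_len, 1)
    freqs = base ** (-torch.arange(0, dim, 2, dtype=torch.float32) / dim)  # (dim/2,)
    angles = pos * freqs                                                   # (seq_len, dim/2)
    x1, x2 = x[:, 0::2], x[:, 1::2]
    cos, sin = angles.cos(), angles.sin()
    out = torch.empty_like(x)
    out[:, 0::2] = x1 * cos - x2 * sin   # standard 2-D rotation per pair
    out[:, 1::2] = x1 * sin + x2 * cos
    return out
```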
### How to Use
```python
# pip install -q transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "Deci/DeciCoder-6b"
device = "cuda"  # for GPU usage or "cpu" for CPU usage

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint, torch_dtype=torch.bfloat16, trust_remote_code=True
).to(device)

inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0]))
```
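Because the model was trained with a Fill-in-the-Middle objective, it can also be prompted FIM-style, completing the span between a known prefix and suffix. A minimal sketch, reusing the `tokenizer`, `model`, and `device` above and *assuming* StarCoder-style sentinel tokens (`<fim_prefix>`, `<fim_suffix>`, `<fim_middle>`); check `tokenizer.special_tokens_map` for the exact strings this checkpoint uses:

```python
# Fill-in-the-Middle prompting (sketch): the sentinel tokens below follow the
# StarCoder convention and are an assumption -- verify them against the
# tokenizer's special tokens before relying on this.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return result"
fim_prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"

fim_inputs = tokenizer(fim_prompt, return_tensors="pt").to(device)
fim_outputs = model.generate(**fim_inputs, max_new_tokens=64)
# The generated continuation is the "middle" that fits between prefix and suffix
print(tokenizer.decode(fim_outputs[0], skip_special_tokens=True))
```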
### Attribution
DeciCoder-6B was trained on the StarCoder Training Dataset, filtered for
Python, Java, JavaScript, Ruby, Rust, C++, C, and C#. For additional information, please
refer to [https://huggingface.co/datasets/bigcode/starcoderdata](https://huggingface.co/datasets/bigcode/starcoderdata).
### Limitations
The model was trained on source code in Python, Java,
JavaScript, Ruby, Rust, C++, C, and C#. While the primary natural language in
the source is English, it does contain other languages. The model can therefore
produce code snippets given some context, but there is no assurance that the
resulting code will function as expected: it might be suboptimal, contain bugs,
or even contain exploits.
## Evaluation
Below are DeciCoder-6B's pass@1 scores on MultiPL-HumanEval:
| Python | JavaScript | Java | C++ | C# | Rust | Go |
|:----------|:----------|:----------|:----------|:----------|:----------|:----------|
| 33.3% | 29.3% | 30.3% |29.93% |20.31% |20.5% |77.47% |
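pass@1 here is the fraction of problems solved by a single generated sample per problem. A minimal sketch of computing it with the `code_eval` metric from Hugging Face's `evaluate` library (the toy problem and candidate below are illustrative, not taken from the benchmark):

```python
# pip install -q evaluate
import os
import evaluate

# code_eval executes model-generated code, so you must opt in explicitly
os.environ["HF_ALLOW_CODE_EVAL"] = "1"

code_eval = evaluate.load("code_eval")

# One toy problem: a unit test plus a list of generated candidates for it
test_cases = ["assert add(2, 3) == 5"]
candidates = [["def add(a, b):\n    return a + b"]]

pass_at_k, results = code_eval.compute(
    references=test_cases, predictions=candidates, k=[1]
)
print(pass_at_k)  # {'pass@1': 1.0}
```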
### Runtime Benchmarks
|Inference Tool | Hardware | Prompt Length | Generation Length | Throughput (tokens/sec) |
|:----------|:----------|:----------|:----------|:----------|
| Qualcomm SDK | Qualcomm AI 100 | 1024 | 1024 | 531.3 |
- Measured for maximal batch size on the device
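For a rough comparison on your own hardware, here is a minimal wall-clock throughput sketch reusing the `model`, `tokenizer`, and `device` from the usage snippet above (illustrative only; it does not reproduce the Qualcomm SDK methodology behind the table):

```python
import time
import torch

prompt = "def quicksort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(device)

if device == "cuda":
    torch.cuda.synchronize()  # make sure timing brackets the GPU work
start = time.perf_counter()
outputs = model.generate(**inputs, max_new_tokens=1024)
if device == "cuda":
    torch.cuda.synchronize()
elapsed = time.perf_counter() - start

generated = outputs.shape[1] - inputs["input_ids"].shape[1]
print(f"{generated / elapsed:.1f} tokens/sec")
```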
## How to Cite
Please cite this model using this format.
```bibtex
@misc{DeciFoundationModels,
  title = {DeciCoder-6B},
  author = {DeciAI Research Team},
  year = {2023},
  url = {https://huggingface.co/deci/decicoder-6B},
}
``` |