twelveand0 committed · verified
Commit 0c94323 · 1 Parent(s): 236edf9

Update the technique report citation

Files changed (1):
  1. README.md +12 -1
README.md CHANGED
@@ -98,4 +98,15 @@ Please refer to [Github](https://github.com/inclusionAI/Ling/blob/master/README.
 This code repository is licensed under [the MIT License](https://huggingface.co/inclusionAI/Ling-Coder-lite/blob/main/LICENCE).
 
 ## Citation
-[TBD]
+
+```
+@misc{codefuse2025samplemattersleveragingmixtureofexperts,
+      title={Every Sample Matters: Leveraging Mixture-of-Experts and High-Quality Data for Efficient and Accurate Code LLM},
+      author={Codefuse and Ling Team and : and Wenting Cai and Yuchen Cao and Chaoyu Chen and Chen Chen and Siba Chen and Qing Cui and Peng Di and Junpeng Fang and Zi Gong and Ting Guo and Zhengyu He and Yang Huang and Cong Li and Jianguo Li and Zheng Li and Shijie Lian and BingChang Liu and Songshan Luo and Shuo Mao and Min Shen and Jian Wu and Jiaolong Yang and Wenjie Yang and Tong Ye and Hang Yu and Wei Zhang and Zhenduo Zhang and Hailin Zhao and Xunjin Zheng and Jun Zhou},
+      year={2025},
+      eprint={2503.17793},
+      archivePrefix={arXiv},
+      primaryClass={cs.LG},
+      url={https://arxiv.org/abs/2503.17793},
+}
+```
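For reference, the BibTeX entry added in this commit can be cited from a LaTeX document as sketched below; the bibliography filename `refs.bib` is an assumption, and only the citation key comes from the diff above:

```latex
% Minimal sketch: assumes the @misc entry from this commit is saved in refs.bib
\documentclass{article}
\begin{document}
Ling-Coder-Lite is described in the technique
report~\cite{codefuse2025samplemattersleveragingmixtureofexperts}.
\bibliographystyle{plain}
\bibliography{refs} % refs.bib is a hypothetical filename
\end{document}
```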