
SelfCite: Self-Supervised Alignment for Context Attribution in Large Language Models

Paper: https://arxiv.org/abs/2502.09604
Authors: Yung-Sung Chuang†, Benjamin Cohen-Wang†, Shannon Zejiang Shen†, Zhaofeng Wu†, Hu Xu‡, Xi Victoria Lin‡, James Glass†, Shang-Wen Li‡, Wen-tau Yih‡
† Massachusetts Institute of Technology, ‡ Meta AI


This model is a reproduction of the SimPO fine-tuned model built on Llama-3.1-8B-Instruct, which was first trained with SFT data from ContextCite (256 calls). It corresponds to the fully self-supervised setting in our paper. Please refer to our GitHub repository for usage and more details: https://github.com/voidism/SelfCite
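For a quick start, the checkpoint can be loaded with the standard Hugging Face `transformers` API. This is a minimal sketch only: the prompt layout in `build_chat` is illustrative, not the exact citation-generation template — see the SelfCite GitHub repository for the full pipeline.

```python
# Minimal usage sketch for voidism/SelfCite-8B-from-CC with transformers.
# Assumes a GPU with enough memory for an 8B model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "voidism/SelfCite-8B-from-CC"


def build_chat(context: str, question: str) -> list[dict]:
    """Assemble a chat-format message list (prompt template is illustrative)."""
    return [{"role": "user", "content": f"{context}\n\nQuestion: {question}"}]


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    messages = build_chat("(long context here)", "What does the passage claim?")
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=512)
    # Decode only the newly generated tokens, skipping the prompt.
    print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```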

Citation

Please cite our paper as well as LongCite if they are helpful to your work!

@article{chuang2025selfcite,
  title={SelfCite: Self-Supervised Alignment for Context Attribution in Large Language Models},
  author={Yung-Sung Chuang and Benjamin Cohen-Wang and Shannon Zejiang Shen and Zhaofeng Wu and Hu Xu and Xi Victoria Lin and James Glass and Shang-Wen Li and Wen-tau Yih},
  journal={arXiv preprint arXiv:2502.09604},
  year={2025}
}

@article{zhang2024longcite,
  title={LongCite: Enabling LLMs to Generate Fine-grained Citations in Long-context QA},
  author={Jiajie Zhang and Yushi Bai and Xin Lv and Wanjun Gu and Danqing Liu and Minhao Zou and Shulin Cao and Lei Hou and Yuxiao Dong and Ling Feng and Juanzi Li},
  journal={arXiv preprint arXiv:2409.02897},
  year={2024}
}
Model size: 8.03B parameters · Tensor type: BF16 · Format: Safetensors
