Overview
JABER (Junior Arabic BERt) is a 12-layer Arabic pretrained language model. JABER obtained rank one on the ALUE leaderboard on 01/09/2021.
This model is only compatible with the code in this GitHub repo; it is not supported by the Transformers library.
Citation
Please cite the following paper when using our code and model:
@misc{ghaddar2021jaber,
  title={JABER: Junior Arabic BERt},
  author={Abbas Ghaddar and Yimeng Wu and Ahmad Rashid and Khalil Bibi and Mehdi Rezagholizadeh and Chao Xing and Yasheng Wang and Duan Xinyu and Zhefeng Wang and Baoxing Huai and Xin Jiang and Qun Liu and Philippe Langlais},
  year={2021},
  eprint={2112.04329},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}