RichardErkhov committed on
Commit ad88b96 · verified · 1 Parent(s): d5c242d

uploaded readme

Files changed (1): README.md (+137, -0)
README.md ADDED
@@ -0,0 +1,137 @@
Quantization made by Richard Erkhov.

[Github](https://github.com/RichardErkhov)

[Discord](https://discord.gg/pvy7H8DZMG)

[Request more models](https://github.com/RichardErkhov/quant_request)

mbart-large-50-many-to-many-mmt - bnb 8bits
- Model creator: https://huggingface.co/facebook/
- Original model: https://huggingface.co/facebook/mbart-large-50-many-to-many-mmt/

Original model description:
---
language:
- multilingual
- ar
- cs
- de
- en
- es
- et
- fi
- fr
- gu
- hi
- it
- ja
- kk
- ko
- lt
- lv
- my
- ne
- nl
- ro
- ru
- si
- tr
- vi
- zh
- af
- az
- bn
- fa
- he
- hr
- id
- ka
- km
- mk
- ml
- mn
- mr
- pl
- ps
- pt
- sv
- sw
- ta
- te
- th
- tl
- uk
- ur
- xh
- gl
- sl
tags:
- mbart-50
pipeline_tag: translation
---

# mBART-50 many-to-many multilingual machine translation

This model is a fine-tuned checkpoint of [mBART-large-50](https://huggingface.co/facebook/mbart-large-50). `mbart-large-50-many-to-many-mmt` is fine-tuned for multilingual machine translation. It was introduced in the paper [Multilingual Translation with Extensible Multilingual Pretraining and Finetuning](https://arxiv.org/abs/2008.00401).

The model can translate directly between any pair of its 50 languages. To translate into a target language, the target language id must be forced as the first generated token: pass it via the `forced_bos_token_id` parameter of the `generate` method.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

article_hi = "संयुक्त राष्ट्र के प्रमुख का कहना है कि सीरिया में कोई सैन्य समाधान नहीं है"
article_ar = "الأمين العام للأمم المتحدة يقول إنه لا يوجد حل عسكري في سوريا."

model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")

# translate Hindi to French
tokenizer.src_lang = "hi_IN"
encoded_hi = tokenizer(article_hi, return_tensors="pt")
generated_tokens = model.generate(
    **encoded_hi,
    forced_bos_token_id=tokenizer.lang_code_to_id["fr_XX"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "Le chef de l 'ONU affirme qu 'il n 'y a pas de solution militaire dans la Syrie."

# translate Arabic to English
tokenizer.src_lang = "ar_AR"
encoded_ar = tokenizer(article_ar, return_tensors="pt")
generated_tokens = model.generate(
    **encoded_ar,
    forced_bos_token_id=tokenizer.lang_code_to_id["en_XX"]
)
tokenizer.batch_decode(generated_tokens, skip_special_tokens=True)
# => "The Secretary-General of the United Nations says there is no military solution in Syria."
```
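
Since this repository hosts the model as a bitsandbytes 8-bit quantization, a minimal loading sketch is shown below. It is a sketch, not a verified recipe: the repository id is a hypothetical placeholder, and 8-bit loading assumes the `bitsandbytes` and `accelerate` packages are installed and a CUDA GPU is available.

```python
# Minimal sketch of 8-bit loading with bitsandbytes.
# Assumptions: `repo_id` is a placeholder for this repository's actual id,
# and `bitsandbytes` + `accelerate` are installed with a CUDA GPU available.
from transformers import (
    BitsAndBytesConfig,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
)

repo_id = "RichardErkhov/..."  # hypothetical placeholder: substitute this repo's id

model = MBartForConditionalGeneration.from_pretrained(
    repo_id,
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),
    device_map="auto",  # place the 8-bit weights on the available GPU(s)
)
tokenizer = MBart50TokenizerFast.from_pretrained(repo_id)
```

Generation then works exactly as in the example above.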

See the [model hub](https://huggingface.co/models?filter=mbart-50) to find more fine-tuned versions of mBART-50.

## Languages covered
Arabic (ar_AR), Czech (cs_CZ), German (de_DE), English (en_XX), Spanish (es_XX), Estonian (et_EE), Finnish (fi_FI), French (fr_XX), Gujarati (gu_IN), Hindi (hi_IN), Italian (it_IT), Japanese (ja_XX), Kazakh (kk_KZ), Korean (ko_KR), Lithuanian (lt_LT), Latvian (lv_LV), Burmese (my_MM), Nepali (ne_NP), Dutch (nl_XX), Romanian (ro_RO), Russian (ru_RU), Sinhala (si_LK), Turkish (tr_TR), Vietnamese (vi_VN), Chinese (zh_CN), Afrikaans (af_ZA), Azerbaijani (az_AZ), Bengali (bn_IN), Persian (fa_IR), Hebrew (he_IL), Croatian (hr_HR), Indonesian (id_ID), Georgian (ka_GE), Khmer (km_KH), Macedonian (mk_MK), Malayalam (ml_IN), Mongolian (mn_MN), Marathi (mr_IN), Polish (pl_PL), Pashto (ps_AF), Portuguese (pt_XX), Swedish (sv_SE), Swahili (sw_KE), Tamil (ta_IN), Telugu (te_IN), Thai (th_TH), Tagalog (tl_XX), Ukrainian (uk_UA), Urdu (ur_PK), Xhosa (xh_ZA), Galician (gl_ES), Slovene (sl_SI)
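
The codes in parentheses are the values the tokenizer expects for `src_lang` and for the `forced_bos_token_id` lookup. As a quick check, the accepted codes can also be listed programmatically; this sketch assumes the `lang_code_to_id` mapping used in the usage example above is present in your `transformers` version.

```python
# Sketch: enumerate the 50 language codes this tokenizer accepts.
# Assumes `lang_code_to_id` exists, as in the usage example above.
from transformers import MBart50TokenizerFast

tokenizer = MBart50TokenizerFast.from_pretrained("facebook/mbart-large-50-many-to-many-mmt")
codes = sorted(tokenizer.lang_code_to_id)
print(len(codes))  # 50
print(codes[:5])   # ['af_ZA', 'ar_AR', 'az_AZ', 'bn_IN', 'cs_CZ']
```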

## BibTeX entry and citation info
```
@article{tang2020multilingual,
      title={Multilingual Translation with Extensible Multilingual Pretraining and Finetuning},
      author={Yuqing Tang and Chau Tran and Xian Li and Peng-Jen Chen and Naman Goyal and Vishrav Chaudhary and Jiatao Gu and Angela Fan},
      year={2020},
      eprint={2008.00401},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```