mrqorib committed cc795f8 · verified · 1 Parent(s): 4884979

Update README.md

Files changed (1): README.md (+33 -3)
README.md CHANGED
 
---
language: en
tags:
- moece
- grammar
- grammatical
- grammaticality
- gec
license: cc-by-sa-4.0
base_model:
- google/t5-v1_1-base
- google/t5-v1_1-large
---

# MoECE: Mixture of Error Correction Experts
MoECE is a grammatical error correction model built by converting T5-v1.1 models into mixture-of-experts models. MoECE is more computationally efficient than the original T5 models and produces interpretable corrections by identifying the error type of each correction token.
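
As a rough illustration of the idea, not the actual MoECE implementation (all class names and sizes below are hypothetical), a mixture-of-experts layer replaces a Transformer feed-forward block with several smaller expert networks plus a router that activates only the top-scoring expert(s) for each token:

```python
# Minimal sketch of token-level mixture-of-experts routing (PyTorch).
# Hypothetical names and sizes; not taken from the MoECE codebase.
import torch
import torch.nn as nn

class MoEFeedForward(nn.Module):
    def __init__(self, d_model=768, d_ff=2048, num_experts=4, top_k=1):
        super().__init__()
        self.top_k = top_k
        self.router = nn.Linear(d_model, num_experts)   # scores experts per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x):                               # x: (batch, seq, d_model)
        probs = self.router(x).softmax(dim=-1)          # (batch, seq, num_experts)
        weights, picks = probs.topk(self.top_k, dim=-1) # keep top_k experts per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = (picks == i).any(dim=-1)             # tokens routed to expert i
            if mask.any():
                w = weights.masked_fill(picks != i, 0.0).sum(dim=-1, keepdim=True)
                out[mask] += w[mask] * expert(x[mask])  # weighted expert output
        return out

y = MoEFeedForward()(torch.randn(2, 10, 768))           # output: (2, 10, 768)
```

Because each token only passes through `top_k` (typically small) experts, per-token computation can be lower than in a single large feed-forward block, and the per-token expert choice can be read off directly; MoECE couples experts to grammatical error types so that this routing doubles as an error-type explanation (see the paper for the actual architecture).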

The safety warning on this repository appears because the checkpoints are saved in pickle format. These checkpoints were generated using the Fairseq library and are not directly compatible with the Transformers library. Please refer to the [official repository](https://github.com/nusnlp/moece) for instructions on how to use the models.
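
As a deliberately minimal illustration of what "pickle format" means here, such a checkpoint can be opened with plain `torch.load`; the file name below is hypothetical, and because pickle can execute code on load, only do this with checkpoints you trust:

```python
# Sketch: peek inside a Fairseq-style pickle checkpoint.
# "checkpoint_best.pt" is a hypothetical file name; see the official
# repository for the supported way to load and run the models.
import torch

# weights_only=False is needed because Fairseq checkpoints bundle
# non-tensor objects (e.g., the training config); trusted files only.
state = torch.load("checkpoint_best.pt", map_location="cpu", weights_only=False)
print(state.keys())  # typically includes "model" (parameters) and "args"/"cfg"
```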

MoECE was introduced in the following paper ([PDF](https://mrqorib.github.io/assets/pdf/MoECE.pdf)):
```bibtex
@inproceedings{qorib-etal-2024-efficient,
    title = "Efficient and Interpretable Grammatical Error Correction with Mixture of Experts",
    author = "Qorib, Muhammad Reza and
      Aji, Alham Fikri and
      Ng, Hwee Tou",
    booktitle = "Findings of the Association for Computational Linguistics: EMNLP 2024",
    month = nov,
    year = "2024",
    address = "Miami, Florida, USA",
    publisher = "Association for Computational Linguistics",
}
```