zzzlift committed · Commit 85c8e9b · 1 parent: 86f26b4
create
README.md CHANGED
@@ -2,7 +2,7 @@
 language: zh
 ---

-## bert-chinese-large
+## bert-chinese-homie-large

 This is a Chinese BERT model pre-trained on a large-scale corpus. It is suitable for fine-tuning on specific downstream tasks, or for use as parameter initialization for further pre-training, which can improve performance. Due to excessive alchemy (heavy tuning), it is not suitable for Fill Mask directly unless you have performed a small amount of additional pre-training.
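Taking the model-card description at face value, the sketch below shows how the checkpoint might be loaded with Hugging Face transformers for the two suggested uses: fine-tuning on a downstream task, and continued masked-LM pre-training before using Fill Mask. The repo id `zzzlift/bert-chinese-homie-large` is an assumption inferred from the commit author and the new heading, not something stated in the diff.

```python
# Minimal sketch, not from the model card: load the checkpoint for
# (1) downstream fine-tuning and (2) a small amount of continued
# masked-LM pre-training before Fill Mask use.
from transformers import (
    BertTokenizerFast,
    BertForMaskedLM,
    BertForSequenceClassification,
)

model_id = "zzzlift/bert-chinese-homie-large"  # hypothetical repo id; replace with the real one

tokenizer = BertTokenizerFast.from_pretrained(model_id)

# (1) Use the checkpoint as parameter initialization for a task head
#     (here: 2-class sentence classification), then fine-tune as usual.
clf = BertForSequenceClassification.from_pretrained(model_id, num_labels=2)

# (2) For Fill Mask, first continue masked-LM pre-training briefly on your
#     own corpus (e.g. with Trainer + DataCollatorForLanguageModeling),
#     starting from this head.
mlm = BertForMaskedLM.from_pretrained(model_id)

inputs = tokenizer("这是一个中文句子。", return_tensors="pt")
print(clf(**inputs).logits.shape)  # torch.Size([1, 2])
```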