Text Generation · Transformers · Safetensors · chatglm · feature-extraction · custom_code
JosephusCheung committed · verified
Commit 90fea9a · Parent(s): 3c78a6a

Update README.md

Files changed (1): README.md (+2 −2)
README.md CHANGED
@@ -39,7 +39,7 @@ co2_eq_emissions:
 
 [GGML with ChatGLM.cpp (recommended)](https://huggingface.co/CausalLM/miniG/tree/ggml): https://github.com/li-plus/chatglm.cpp
 
-[GGUF (Text-Only, not recommended)](https://huggingface.co/CausalLM/miniG/tree/gguf)
+[GGUF (Text-Only, not recommended)](https://huggingface.co/CausalLM/miniG/tree/gguf): There is a significant degradation, even with the F16.
 
 A model trained on a synthesis dataset of over **120 million** entries, this dataset having been generated through the application of state-of-the-art language models utilizing large context windows, alongside methodologies akin to retrieval-augmented generation and knowledge graph integration, where the data synthesis is conducted within clusters derived from a curated pretraining corpus of 20 billion tokens, with subsequent validation performed by the model itself.
 
@@ -71,7 +71,7 @@ Despite the absence of thorough alignment with human preferences, the model is u
 
 [GGML 用于 ChatGLM.cpp (推荐)](https://huggingface.co/CausalLM/miniG/tree/ggml): https://github.com/li-plus/chatglm.cpp
 
-[GGUF (纯文本,不推荐)](https://huggingface.co/CausalLM/miniG/tree/gguf)
+[GGUF (纯文本,不推荐)](https://huggingface.co/CausalLM/miniG/tree/gguf): 即使使用F16,性能也有显著下降。
 
 一个在超过**1.2亿**条数据合成数据集上训练的模型,这些数据集是通过应用具有大上下文窗口的最先进语言模型生成的,并结合了类似于检索增强生成和知识图谱集成的方法,数据合成是在一个由200亿个标记组成的预训练语料库中提取的聚类内进行的,随后由模型本身进行验证。
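The card's tags (Transformers, Safetensors, custom_code) indicate the non-quantized checkpoint is meant to be loaded through the Transformers remote-code path. A minimal sketch of that, not taken from the commit itself — the `load_minig` helper and `device_map` default are assumptions for illustration; the chatglm architecture ships custom modeling code, so `trust_remote_code=True` is required:

```python
# Hedged sketch: loading the full-precision CausalLM/miniG checkpoint via
# Transformers. `load_minig` is a hypothetical helper, not part of the repo.

REPO_ID = "CausalLM/miniG"

def load_minig(device_map: str = "auto"):
    """Download and load tokenizer + model.

    The checkpoint is large; a GPU (or a suitable `device_map` spread across
    devices) is strongly advised. Import is deferred so merely defining this
    helper does not pull in the heavy Transformers dependency.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        REPO_ID,
        trust_remote_code=True,  # chatglm custom modeling code lives in the repo
        device_map=device_map,
    )
    return tokenizer, model
```

For the quantized paths the commit discusses, the recommended route is the GGML branch consumed by chatglm.cpp (linked above) rather than GGUF, per the note added in this commit.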