---
library_name: transformers
license: mit
language:
- en
tags:
- chronologically consistent
- modernbert
- glue
pipeline_tag: fill-mask
inference: false
---

# ChronoBERT

## Model Description

ChronoBERT is a series of **high-performance, chronologically consistent large language models (LLMs)** designed to eliminate lookahead bias and training leakage while maintaining good language understanding in time-sensitive applications. The models are pretrained on **diverse, high-quality, open-source, and timestamped text** to maintain chronological consistency.

All models in the series achieve **GLUE benchmark scores that surpass standard BERT.**
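In practice, chronological consistency means pairing each dated observation with a model vintage whose pretraining cutoff falls strictly before that date, so no future text can leak into the prediction. The sketch below illustrates that selection logic; the vintage dates are illustrative placeholders, not the published checkpoint list.

```python
from datetime import date

# Illustrative pretraining cutoffs only; consult the model collection
# for the vintages that were actually released.
VINTAGES = [date(1999, 12, 31), date(2007, 12, 31), date(2015, 12, 31)]

def consistent_vintage(observation: date) -> date:
    """Latest pretraining cutoff strictly before the observation date,
    guaranteeing the model never saw text written after it."""
    eligible = [v for v in VINTAGES if v < observation]
    if not eligible:
        raise ValueError("no chronologically consistent vintage exists")
    return max(eligible)

print(consistent_vintage(date(2010, 6, 1)))  # 2007-12-31
```
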
- **Language(s) (NLP):** English
- **License:** MIT License

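Since the front matter declares `library_name: transformers` and `pipeline_tag: fill-mask` (with the hosted inference widget disabled via `inference: false`), the model is meant to be run locally. A minimal usage sketch follows; the repository ID is a placeholder to be replaced with an actual ChronoBERT checkpoint.

```python
from transformers import pipeline

# Placeholder repository ID; substitute the actual ChronoBERT checkpoint.
fill_mask = pipeline("fill-mask", model="<org>/<chronobert-checkpoint>")

# ChronoBERT builds on the ModernBERT architecture, so the standard
# [MASK] token applies.
for pred in fill_mask("The central bank raised interest [MASK] yesterday."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```
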
## Model Sources

- **Paper:** "Chronologically Consistent Large Language Models" (He, Lv, Manela, Wu, 2025)