robinsmits committed
Commit 5d9aed9 · verified · 1 Parent(s): 0c09b2c

Update README.md

Files changed (1): README.md +2 -0
README.md CHANGED
@@ -22,6 +22,8 @@ license: apache-2.0
 
 This continual pretrained model is pretrained on roughly 2.4 Billion tokens of Dutch language data based on Wikipedia and MC4.
 
+The primary objective of continual pretraining on Dutch was to make the model more 'fluent' when using the Dutch language. It will also have gained some additional Dutch knowledge.
+
 As a base model the IBM Granite 3.0 2B Instruct model was used.
 
 See [ibm-granite/granite-3.0-2b-instruct](https://huggingface.co/ibm-granite/granite-3.0-2b-instruct) for all information about the IBM Granite foundation model.
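
As context for the README text above, a causal language model of this kind is typically loaded through the Hugging Face `transformers` API. The snippet below is a minimal sketch, not part of the commit: it uses the base model's repo id from the link above, since the repo id of the continually pretrained Dutch checkpoint is not stated in this diff, and the Dutch prompt is only an illustrative example.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base model referenced in the README; the continually pretrained Dutch
# checkpoint would be loaded the same way by substituting its own repo id.
model_id = "ibm-granite/granite-3.0-2b-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Short Dutch prompt to inspect the fluency of the generated continuation.
inputs = tokenizer("De hoofdstad van Nederland is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```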