pietrolesci committed
Commit 1483d4a · verified · 1 parent: 7222701

Update README.md

Files changed (1)
  1. README.md (+3 -3)
README.md CHANGED
@@ -7,9 +7,9 @@ size_categories:
 - 100B<n<1T
 ---
 
-This dataset contains the fully prepared data (tokenised and pre-shuffled) used to train the Pythia (deduped) models—which you can find under the EleutherAI organisation and also listed in my [Memerisation-Profiles collection](https://huggingface.co/collections/pietrolesci/memorisation-profiles-6619604c4594c878cd9d451f).
-It is the very same data as [EleutherAI/pile-deduped-pythia-preshuffled](https://huggingface.co/datasets/EleutherAI/pile-deduped-pythia-preshuffled) in a more manageable format.
-That is, instead of storing the data in the Megatron format used by the GPT-NeoX library, I stored them in a parquet format.
+This dataset contains the fully prepared data, which has been tokenized and pre-shuffled, used to train the Pythia (deduplicated) models. You can find these models under the EleutherAI organization, and they are also listed in my [Memorization Profiles collection](https://huggingface.co/collections/pietrolesci/memorisation-profiles-6619604c4594c878cd9d451f).
+
+This data is the same as the one found in [EleutherAI/pile-deduped-pythia-preshuffled](https://huggingface.co/datasets/EleutherAI/pile-deduped-pythia-preshuffled), but it is presented in a more manageable format. Instead of using the Megatron format utilized by the GPT-NeoX library, I have stored the data in a parquet format.
 
 
 ## Format
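
For context on the parquet format mentioned in the updated README, here is a minimal sketch of loading a parquet-backed Hub dataset with the `datasets` library; the repository id and the streaming setup are illustrative assumptions, not part of this commit:

```python
# Minimal sketch: reading a parquet-backed Hub dataset with the `datasets` library.
# NOTE: the repository id below is an assumption for illustration; use the id of
# this dataset as shown on its card.
from datasets import load_dataset

# Stream to avoid materialising hundreds of billions of tokens locally.
ds = load_dataset(
    "pietrolesci/pile-deduped-pythia-preshuffled",  # hypothetical repo id
    split="train",
    streaming=True,
)

# Inspect the first pre-shuffled, tokenized row; the column layout is described
# in the "Format" section of the README.
row = next(iter(ds))
print(row.keys())
```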