Modalities: Text
Formats: parquet
Libraries: Datasets, Dask
matthieumeeus97 committed (verified)
Commit bcde28e · 1 parent: 62d1515

Update README.md

Files changed (1):
  1. README.md +2 -3
README.md CHANGED
@@ -32,6 +32,5 @@ To get the members close to the cutoff data, we collect the 13,155 papers publis
 We process the raw LaTeX files using this [script](https://github.com/togethercomputer/RedPajama-Data/blob/rp_v1/data_prep/arxiv/run_clean.py).
 
 This dataset has been used as a source for 'member' documents to develop (document-level) MIAs against LLMs, using data collected shortly before (member) and after (non-member) the training cutoff date for the target model ([the suite of OpenLLaMA models](https://huggingface.co/openlm-research/open_llama_7b)).
-For more details and results, see the section on Regression Discontinuity Design (RDD) in the paper ["SoK: Membership Inference Attacks on LLMs are Rushing Nowhere (and How to Fix It)"](https://arxiv.org/pdf/2406.17975).
-
-For non-members in the RDD setup, we refer to our [GitHub repo](https://github.com/computationalprivacy/mia_llms_benchmark/tree/main/document_level).
+For non-members in the RDD setup, we refer to our [GitHub repo](https://github.com/computationalprivacy/mia_llms_benchmark/tree/main/document_level).
+For more details and results, see the section on Regression Discontinuity Design (RDD) in the paper ["SoK: Membership Inference Attacks on LLMs are Rushing Nowhere (and How to Fix It)"](https://arxiv.org/pdf/2406.17975).