---
datasets:
- davanstrien/aart-ai-safety-dataset
- obalcells/advbench
- databricks/databricks-dolly-15k
---
# Malicious & Jailbreaking Prompt Classifier
## Datasets Used
- [MaliciousInstruct](https://github.com/Princeton-SysML/Jailbreak_LLM/blob/main/data/MaliciousInstruct.txt)
- [AART](https://github.com/google-research-datasets/aart-ai-safety-dataset/blob/main/aart-v1-20231117.csv)
- [StrongREJECT](https://github.com/alexandrasouly/strongreject/blob/main/strongreject_dataset/strongreject_dataset.csv)
- [DAN](https://github.com/verazuo/jailbreak_llms/tree/main/data)
- [AdvBench](https://github.com/llm-attacks/llm-attacks/tree/main/data/advbench)
- [Databricks-Dolly](https://huggingface.co/datasets/databricks/databricks-dolly-15k)