
LocalDoc/azerbaijani_toxicity_classifier
Text Classification • 0.3B parameters
⚠️ CONTENT WARNING ⚠️
This dataset contains highly offensive and toxic content in the Azerbaijani language.
🔞 This dataset is intended for mature audiences (18+) and research purposes only.
This dataset contains Azerbaijani text comments labeled with toxicity scores across 7 categories.
Each comment is scored 0.0 (not toxic) or 1.0 (highly toxic) for each category.
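This per-category labeling corresponds to a multi-label text classification setup. Below is a minimal usage sketch, assuming the model can be loaded with the Hugging Face transformers text-classification pipeline; the actual category label names are defined by the model and are not listed here, and the example comment is a placeholder.

```python
from transformers import pipeline

# Assumed usage sketch: load the classifier and request a score for every
# category instead of only the top one. Category names come from the model config.
classifier = pipeline(
    "text-classification",
    model="LocalDoc/azerbaijani_toxicity_classifier",
    top_k=None,  # return all category scores, not just the highest
)

results = classifier(["nümunə şərh"])  # placeholder Azerbaijani text ("sample comment")
for label_score in results[0]:
    # Each entry pairs one toxicity category with a score between 0.0 and 1.0.
    print(label_score["label"], round(label_score["score"], 3))
```

Scores near 1.0 for a category indicate the comment is flagged as toxic in that category, mirroring the 0.0/1.0 labels used in the dataset.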
The views and content expressed in this dataset do not reflect the opinions of the dataset creators or affiliated institutions. This material is provided solely for research purposes to advance the field of AI safety and content moderation.
For more information, questions, or issues, please contact LocalDoc at [[email protected]].