---
dataset_info:
  features:
  - name: comment
    dtype: string
  - name: identity_attack
    dtype: float64
  - name: insult
    dtype: float64
  - name: obscene
    dtype: float64
  - name: severe_toxicity
    dtype: float64
  - name: sexual_explicit
    dtype: float64
  - name: threat
    dtype: float64
  - name: toxicity
    dtype: float64
  splits:
  - name: train
    num_bytes: 9279129
    num_examples: 74780
  download_size: 4110800
  dataset_size: 9279129
task_categories:
- text-classification
language:
- az
tags:
- toxicity
- content-warning
- nsfw
- azerbaijani
- hate-speech-detection
size_categories:
- 10K<n<100K
extra_gated_prompt: >-
  ⚠️ This dataset contains adult content, profanity, and toxic language in
  Azerbaijani. By accessing this dataset, you confirm that you are 18+ and will
  use it responsibly for research purposes only.
extra_gated_fields:
  Company/Organization: text
  Research Purpose: text
  I confirm I am 18+ years old: checkbox
  I will use this dataset responsibly: checkbox
license: cc-by-4.0
pretty_name: Azerbaijani Toxicity Classification Dataset
configs:
- config_name: default
  data_files:
  - split: train
    path: data/train-*
---

# Azerbaijani Toxicity Classification Dataset

⚠️ **CONTENT WARNING** ⚠️

**This dataset contains highly offensive and toxic content in the Azerbaijani language, including:**
- Explicit sexual content
- Profanity and vulgar language
- Hate speech and personal attacks
- Threats and harassment
- Other harmful and disturbing material

**🔞 This dataset is intended for mature audiences (18+) and research purposes only.**

## Dataset Description

This dataset contains Azerbaijani text comments labeled with toxicity scores across 7 categories:

- **identity_attack**: Attacks based on identity characteristics
- **insult**: General insults and offensive language  
- **obscene**: Profanity and vulgar content
- **severe_toxicity**: Extremely harmful content
- **sexual_explicit**: Sexual content and references
- **threat**: Threats of violence or harm
- **toxicity**: Overall toxicity score

Each comment carries a binary label for each category: 0.0 (not toxic) or 1.0 (toxic). The labels are stored as `float64` values, as declared in the schema above.

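The record layout can be illustrated with a small, self-contained sketch. The comments and label values below are invented placeholders for illustration only; they are not rows from the actual dataset:

```python
# Hypothetical records mirroring the dataset schema described above.
# The text and label values are invented examples, not real dataset rows.
records = [
    {"comment": "example neutral comment", "identity_attack": 0.0,
     "insult": 0.0, "obscene": 0.0, "severe_toxicity": 0.0,
     "sexual_explicit": 0.0, "threat": 0.0, "toxicity": 0.0},
    {"comment": "example insulting comment", "identity_attack": 0.0,
     "insult": 1.0, "obscene": 0.0, "severe_toxicity": 0.0,
     "sexual_explicit": 0.0, "threat": 0.0, "toxicity": 1.0},
]

# The six fine-grained categories; "toxicity" is the overall score.
LABELS = ["identity_attack", "insult", "obscene", "severe_toxicity",
          "sexual_explicit", "threat"]

def flagged(record):
    """Return True if any fine-grained category is labeled 1.0."""
    return any(record[label] == 1.0 for label in LABELS)

toxic_comments = [r["comment"] for r in records if flagged(r)]
print(toxic_comments)  # → ['example insulting comment']
```

The same filtering logic applies unchanged to records loaded via the `datasets` library, since each example is a dict with these exact field names.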
## Disclaimer

The views and content expressed in this dataset do not reflect the opinions of the dataset creators or affiliated institutions. This material is provided solely for research purposes to advance the field of AI safety and content moderation.

---

## Contact

For more information, questions, or issues, please contact LocalDoc at [[email protected]].