LaciaStudio committed on
Commit 853b8e7 · verified · 1 Parent(s): a4ef65a

Update README.md

Files changed (1)
  1. README.md +4 -1
README.md CHANGED

@@ -21,6 +21,9 @@ tags:
 - toxic-language-detection
 - sentiment-analysis
 ---
+
+![Official Luna Logo](https://huggingface.co/username/repo-name/resolve/main/my-image.png)
+
 # Model Card for Model ID
 
 This model is designed to classify whether a given text contains offensive language or not. It was trained on a set of words labeled as either "normal" or "offensive." The model is capable of distinguishing between these two categories with high accuracy.
@@ -48,7 +51,7 @@ Then, there are three more dense layers with progressively decreasing numbers of
 Finally, there is a dense layer with a single neuron at the output, using the sigmoid activation function to make a binary decision (0 — non-offensive, 1 — offensive word).
 The model is trained using the RMSprop optimizer with a low learning rate (0.0001), allowing it to train smoothly with minimal fluctuations. The binary cross-entropy loss function is used, which is ideal for binary classification tasks where the goal is to evaluate the probability of belonging to one of two categories.
 
-- **Developed by:** LaciaStudio/LaciaAI
+- **Developed by:** LaciaStudio
 - **Model type:** text-classification
 - **Language(s) (NLP):** Russian, English
 - **License:** cc-by-nc-4.0
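
For reference, below is a minimal Keras sketch of the training setup the updated README describes: a stack of dense layers narrowing toward a single sigmoid output neuron, RMSprop with a 0.0001 learning rate, and binary cross-entropy loss. The layer widths, the input dimension, and the `build_classifier` helper are illustrative assumptions, not the repository's actual code.

```python
# Hypothetical sketch of the setup described in the model card; layer sizes
# and input dimension are assumptions, not the repository's real values.
from tensorflow import keras

def build_classifier(input_dim: int = 128) -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(input_dim,)),
        # Dense layers with progressively decreasing numbers of neurons.
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(64, activation="relu"),
        # Single output neuron with sigmoid: 0 = non-offensive, 1 = offensive.
        keras.layers.Dense(1, activation="sigmoid"),
    ])
    # RMSprop with a low learning rate plus binary cross-entropy, as in the README.
    model.compile(
        optimizer=keras.optimizers.RMSprop(learning_rate=0.0001),
        loss="binary_crossentropy",
        metrics=["accuracy"],
    )
    return model

# Usage: model = build_classifier(); model.fit(x_train, y_train, epochs=10)
```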