mansoorhamidzadeh committed on
Commit 6049a7f · verified · 1 Parent(s): aefa11f

Update README.md

Files changed (1)
  1. README.md +67 -6
README.md CHANGED
@@ -1,12 +1,73 @@
  ---
- library_name: transformers
  license: apache-2.0
  language:
  - en
  pipeline_tag: text-classification
  ---
- This is labels:
- label 1 --> 'Household'
- label 2 -->'Books'
- label 3 --> 'Clothing & Accessories'
- label 4 -->'Electronics'
 
  ---
  license: apache-2.0
+ base_model: bert-base-uncased
+ tags:
+ - text-classification
+ - bert
+ - english
+ model-index:
+ - name: BERT Classification
+   results: []
  language:
  - en
  pipeline_tag: text-classification
+ metrics:
+ - accuracy
  ---
+
+ # BERT Classification
+
+ ## Model Overview
+
+ - **Model Name**: BERT Classification
+ - **Model Type**: Text Classification
+ - **Developer**: Mansoor Hamidzadeh
+ - **Framework**: Transformers
+ - **Language**: English
+ - **License**: Apache-2.0
+
+ ## Model Description
+
+ This model is a fine-tuned BERT (Bidirectional Encoder Representations from Transformers) model designed for text classification. It categorizes text into four labels (a sketch for attaching these names to the model configuration follows the list):
+
+ - **Label 1**: Household
+ - **Label 2**: Books
+ - **Label 3**: Clothing & Accessories
+ - **Label 4**: Electronics
+
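+ If the published configuration exposes only generic class ids (`LABEL_0` … `LABEL_3`), the snippet below sketches one way to attach the category names above to the checkpoint's configuration. The 0-based ordering is an assumption and should be checked against the repository's `config.json`.
+
+ ```python
+ # Sketch (not part of the original card): write readable label names into the config.
+ # Assumption: class index 0 = Household, 1 = Books, 2 = Clothing & Accessories, 3 = Electronics.
+ from transformers import AutoConfig, AutoModelForSequenceClassification
+
+ model_id = "mansoorhamidzadeh/bert_classification"
+ config = AutoConfig.from_pretrained(model_id)
+ config.id2label = {0: "Household", 1: "Books", 2: "Clothing & Accessories", 3: "Electronics"}
+ config.label2id = {label: idx for idx, label in config.id2label.items()}
+
+ # Reload the checkpoint with the enriched config so predictions carry readable names.
+ model = AutoModelForSequenceClassification.from_pretrained(model_id, config=config)
+ ```
+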
+ ## Technical Details
+
+ - **Model Size**: 109M parameters
+ - **Tensor Type**: F32
+ - **File Format**: Safetensors
+
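+ As an optional local sanity check (not part of the original card), the figures above can be confirmed after loading the checkpoint:
+
+ ```python
+ # Verify the reported parameter count and tensor dtype.
+ from transformers import AutoModelForSequenceClassification
+
+ model = AutoModelForSequenceClassification.from_pretrained("mansoorhamidzadeh/bert_classification")
+ print(f"parameters: {model.num_parameters():,}")   # expected to be roughly 109M
+ print(f"dtype: {next(model.parameters()).dtype}")   # expected to be torch.float32
+ ```
+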
+ ## How To Use
+
+ ```python
+ # Use a pipeline as a high-level helper
+ from transformers import pipeline
+
+ pipe = pipeline("text-classification", model="mansoorhamidzadeh/bert_classification")
+
+ # Replace with your own input text
+ text = "Ceramic dinner plate set, pack of six"
+ print(pipe(text))
+ ```
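+
+ For finer-grained control than the pipeline (for example, access to class probabilities), the checkpoint can also be used directly. The sketch below assumes it loads with `AutoModelForSequenceClassification` and that label names come from `config.id2label`:
+
+ ```python
+ import torch
+ from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+ model_id = "mansoorhamidzadeh/bert_classification"
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForSequenceClassification.from_pretrained(model_id)
+
+ # Illustrative input; replace with your own text.
+ inputs = tokenizer("Paperback edition of a classic mystery novel", return_tensors="pt", truncation=True)
+ with torch.no_grad():
+     probs = model(**inputs).logits.softmax(dim=-1)[0]
+
+ predicted = probs.argmax().item()
+ print(model.config.id2label[predicted], round(probs[predicted].item(), 3))
+ ```
+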
+ ## Usage
+
+ The model is useful for categorizing product descriptions or similar text data into predefined labels.
+
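+ The pipeline also accepts a list of texts, so a batch of product descriptions can be labelled in one call; the inputs below are only illustrative:
+
+ ```python
+ from transformers import pipeline
+
+ pipe = pipeline("text-classification", model="mansoorhamidzadeh/bert_classification")
+ descriptions = [
+     "Non-stick frying pan with heat-resistant handle",
+     "Hardcover world atlas with updated maps",
+     "Men's cotton crew-neck t-shirt, pack of three",
+     "Wireless over-ear headphones with noise cancellation",
+ ]
+ for text, result in zip(descriptions, pipe(descriptions)):
+     print(f"{result['label']:>10}  {result['score']:.3f}  {text}")
+ ```
+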
+ ## Performance
+
+ - **Downloads Last Month**: 4
+
+ ## Citation
+
+ If you use this model in your research or applications, please cite it as follows:
+
+ ```bibtex
+ @misc{mansoorhamidzadeh_2024_bert_classification,
+   author       = {mansoorhamidzadeh},
+   title        = {BERT Classification},
+   year         = {2024},
+   publisher    = {Hugging Face},
+   howpublished = {\url{https://huggingface.co/mansoorhamidzadeh/bert_classification}},
+ }
+ ```