---
base_model: mini1013/master_domain
library_name: setfit
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- setfit
- sentence-transformers
- text-classification
- generated_from_setfit_trainer
widget:
- text: 시카 클리닉 비듬제거 두피 샴푸 1000ml (#M)뷰티>헤어/바디/미용기기>헤어케어>샴푸 CJmall > 뷰티 > 헤어/바디/미용기기
> 헤어케어 > 샴푸
- text: 더바디샵 진저 샴푸 모발 관리 400ML 3 MinSellAmount (#M)바디/헤어>헤어케어>샴푸/린스 Gmarket > 뷰티
> 바디/헤어 > 헤어케어 > 샴푸/린스
- text: 리엔 자윤 모근강화 지성 샴푸 500ml × 2 (#M)쿠팡 홈>생활용품>헤어/바디/세안>샴푸/린스>샴푸>한방샴푸 Coupang >
뷰티 > 헤어 > 샴푸 > 한방샴푸
- text: '[댄트롤] 딥 클린 박하 솔트 샴푸 820ml 딥 클린 박하 솔트 샴푸 820ml (#M)홈>화장품/미용>헤어케어>샴푸 Naverstore
> 화장품/미용 > 헤어케어 > 샴푸'
- text: 쿤달 클렌징 지성샴푸 500ml ★신향★일랑일랑 (#M)홈>헤어>샴푸 Naverstore > 화장품/미용 > 헤어케어 > 샴푸
inference: true
model-index:
- name: SetFit with mini1013/master_domain
results:
- task:
type: text-classification
name: Text Classification
dataset:
name: Unknown
type: unknown
split: test
metrics:
- type: accuracy
value: 0.8367556468172485
name: Accuracy
---
# SetFit with mini1013/master_domain
This is a [SetFit](https://github.com/huggingface/setfit) model for text classification. It uses [mini1013/master_domain](https://huggingface.co/mini1013/master_domain) as the Sentence Transformer embedding model and a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance as the classification head.
The model has been trained using an efficient few-shot learning technique that involves:
1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning.
2. Training a classification head with features from the fine-tuned Sentence Transformer.
## Model Details
### Model Description
- **Model Type:** SetFit
- **Sentence Transformer body:** [mini1013/master_domain](https://huggingface.co/mini1013/master_domain)
- **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance
- **Maximum Sequence Length:** 512 tokens
- **Number of Classes:** 4 classes
<!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->
### Model Sources
- **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit)
- **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055)
- **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit)
### Model Labels
| Label | Examples |
|:------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
| 3 | <ul><li>'트리플에스 대용량 약산성 탈모샴푸 1350ml/세트구성 탈모샴푸 580ml+580ml+무료증정(5ml 10개) 쇼킹딜 홈>뷰티>헤어>샴푸/린스/기능성;11st>뷰티>헤어>샴푸/린스/기능성;(#M)11st>헤어케어>샴푸>일반 11st > 뷰티 > 헤어케어 > 샴푸'</li><li>'닥터방기원 랩샴푸 탈모샴푸 1L x 3개 (#M)헤어케어>샴푸>샴푸바 AD > 11st > 뷰티 > 헤어케어 > 샴푸 > 샴푸바'</li><li>'[메디올]탈모완화 우디향 샴푸/두피청정 퓨리파잉 샴푸/트리트먼트/헤어케어 15.퓨리파잉 샴푸 480ml 2개_+블루퓨리파잉샴푸 100ml 1개+시트 트먼 50ml 1개 (#M)헤어케어>샴푸>샴푸바 11st Hour Event > 패션/뷰티 > 뷰티 > 헤어 > 샴푸/린스/기능성'</li></ul> |
| 0 | <ul><li>'[본사직영] 떡진머리 드라이 파우더 (#M)위메프 > 생활·주방용품 > 바디/헤어 > 바디로션/핸드/풋 > 바디보습 위메프 > 뷰티 > 바디/헤어 > 바디로션/핸드/풋 > 바디보습'</li><li>'[코랩][3개세트] 코랩 비건 헤어 드라이샴푸 200ml (6종 택1, 교차선택 가능) 파라다이스_프레쉬_트로피컬 (#M)11st>헤어케어>샴푸>일반 11st > 뷰티 > 헤어케어 > 샴푸'</li><li>'르네휘테르 나뚜리아 인비저블 드라이 샴푸 200ml (#M)화장품/미용>헤어케어>샴푸 AD > traverse > Naverstore > 화장품/미용 > 헤어케어 > 샴푸 > 드라이샴푸'</li></ul> |
| 2 | <ul><li>'미쟝센 퍼펙트세럼 샴푸/컨디셔너 680ml 2입 모음 09__슈퍼리치 샴푸1입+컨디셔너1입 ssg > 뷰티 > 헤어/바디 > 헤어케어 > 헤어트리트먼트;ssg > 뷰티 > 헤어/바디 > 헤어케어 > 샴푸;ssg > 뷰티 > 헤어/바디 > 헤어케어 ssg > 뷰티 > 헤어/바디 > 헤어케어 > 린스/컨디셔너'</li><li>'[대용량 퍼퓸] 수오가닉 대용량 약산성 아로마 퍼퓸 샴푸워시 1000ml 5개 옵션 5개 선택 해주세요_샴푸워시 오스만투스 1000ml (#M)화장품/미용>헤어케어>샴푸 AD > Naverstore > 화장품/미용 > 헤어케어 > 샴푸 > 약산성샴푸'</li><li>'발샴푸 300ml 공식수입정품 발냄새 발전용 (#M)SSG.COM/헤어/바디/바디케어/풋케어 ssg > 뷰티 > 헤어/바디 > 바디케어 > 풋케어'</li></ul> |
| 1 | <ul><li>'삼쩜오 저탄소 샴푸바만들기 (교육용) 100g 1개분량 샴푸바 키트 1인 키트 파란색_레몬그라스 (#M)화장품/미용>헤어케어>샴푸 AD > traverse > Naverstore > 화장품/미용 > 헤어케어 > 샴푸 > 샴푸바'</li><li>'오디샤 저자극 약산성 천연 다시마추출물 샴푸바 더퓨어 120g (#M)화장품/미용>헤어케어>샴푸 Naverstore > 화장품/미용 > 헤어케어 > 샴푸 > 샴푸바'</li><li>'솝퓨리 커스텀 세트 노세범 샴푸바_안티로스 샴푸바_네버드라이 페이셜&바디바 (#M)화장품/미용>헤어케어>샴푸 AD > Naverstore > 화장품/미용 > 헤어케어 > 샴푸 > 샴푸바'</li></ul> |
## Evaluation
### Metrics
| Label | Accuracy |
|:--------|:---------|
| **all** | 0.8368 |
## Uses
### Direct Use for Inference
First install the SetFit library:
```bash
pip install setfit
```
Then you can load this model and run inference:
```python
from setfit import SetFitModel
# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_top_bt13_3_test_flat")
# Run inference
preds = model("쿤달 딥 클렌징 지성샴푸 500ml ★신향★일랑일랑 (#M)홈>헤어>샴푸 Naverstore > 화장품/미용 > 헤어케어 > 샴푸")
```
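The model returns the numeric label IDs shown in the Model Labels table above. As a minimal sketch (assuming only the standard `SetFitModel` API from setfit 1.x, nothing specific to this repository), you can also classify a batch of product titles at once and inspect the per-class probabilities from the LogisticRegression head:
```python
from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_cate_top_bt13_3_test_flat")

# Batch inference: one predicted label ID per product title
titles = [
    "쿤달 딥 클렌징 지성샴푸 500ml ★신향★일랑일랑 (#M)홈>헤어>샴푸 Naverstore > 화장품/미용 > 헤어케어 > 샴푸",
    "르네휘테르 나뚜리아 인비저블 드라이 샴푸 200ml (#M)화장품/미용>헤어케어>샴푸",
]
preds = model.predict(titles)
probas = model.predict_proba(titles)  # class probabilities from the logistic-regression head
print(preds, probas, sep="\n")
```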
<!--
### Downstream Use
*List how someone could finetune this model on their own dataset.*
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Set Metrics
| Training set | Min | Median | Max |
|:-------------|:----|:-------|:----|
| Word count | 13 | 21.665 | 44 |

| Label | Training Sample Count |
|:------|:----------------------|
| 0 | 50 |
| 1 | 50 |
| 2 | 50 |
| 3 | 50 |
### Training Hyperparameters
- batch_size: (64, 64)
- num_epochs: (30, 30)
- max_steps: -1
- sampling_strategy: oversampling
- num_iterations: 100
- body_learning_rate: (2e-05, 1e-05)
- head_learning_rate: 0.01
- loss: CosineSimilarityLoss
- distance_metric: cosine_distance
- margin: 0.25
- end_to_end: False
- use_amp: False
- warmup_proportion: 0.1
- l2_weight: 0.01
- seed: 42
- eval_max_steps: -1
- load_best_model_at_end: False
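These values follow the field names of setfit's `TrainingArguments`. As a rough sketch of how they could be wired up with the setfit 1.1 `Trainer` API (the tiny dataset below is a hypothetical placeholder, not the original 50-shot-per-class training data), training might look like this:
```python
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, Trainer, TrainingArguments

# Hypothetical stand-in for the training data: product titles with label IDs 0-3
train_dataset = Dataset.from_dict({
    "text": [
        "닥터방기원 랩샴푸 탈모샴푸 1L x 3개 (#M)헤어케어>샴푸>샴푸바",
        "르네휘테르 나뚜리아 인비저블 드라이 샴푸 200ml (#M)화장품/미용>헤어케어>샴푸",
    ],
    "label": [3, 0],
})

model = SetFitModel.from_pretrained("mini1013/master_domain")

args = TrainingArguments(
    batch_size=(64, 64),                 # (embedding phase, classifier phase)
    num_epochs=(30, 30),
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    metric="accuracy",
)
trainer.train()  # step 1: contrastive fine-tuning of the body; step 2: fitting the classification head
```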
### Training Results
| Epoch | Step | Training Loss | Validation Loss |
|:-------:|:----:|:-------------:|:---------------:|
| 0.0032 | 1 | 0.4592 | - |
| 0.1597 | 50 | 0.3966 | - |
| 0.3195 | 100 | 0.3419 | - |
| 0.4792 | 150 | 0.2777 | - |
| 0.6390 | 200 | 0.2014 | - |
| 0.7987 | 250 | 0.1159 | - |
| 0.9585 | 300 | 0.06 | - |
| 1.1182 | 350 | 0.0152 | - |
| 1.2780 | 400 | 0.0032 | - |
| 1.4377 | 450 | 0.0016 | - |
| 1.5974 | 500 | 0.0009 | - |
| 1.7572 | 550 | 0.0005 | - |
| 1.9169 | 600 | 0.0004 | - |
| 2.0767 | 650 | 0.0002 | - |
| 2.2364 | 700 | 0.0002 | - |
| 2.3962 | 750 | 0.0001 | - |
| 2.5559 | 800 | 0.0001 | - |
| 2.7157 | 850 | 0.0001 | - |
| 2.8754 | 900 | 0.0001 | - |
| 3.0351 | 950 | 0.0 | - |
| 3.1949 | 1000 | 0.0 | - |
| 3.3546 | 1050 | 0.0 | - |
| 3.5144 | 1100 | 0.0 | - |
| 3.6741 | 1150 | 0.0 | - |
| 3.8339 | 1200 | 0.0 | - |
| 3.9936 | 1250 | 0.0 | - |
| 4.1534 | 1300 | 0.0 | - |
| 4.3131 | 1350 | 0.0 | - |
| 4.4728 | 1400 | 0.0 | - |
| 4.6326 | 1450 | 0.0 | - |
| 4.7923 | 1500 | 0.0 | - |
| 4.9521 | 1550 | 0.0 | - |
| 5.1118 | 1600 | 0.0 | - |
| 5.2716 | 1650 | 0.0 | - |
| 5.4313 | 1700 | 0.0 | - |
| 5.5911 | 1750 | 0.0 | - |
| 5.7508 | 1800 | 0.0 | - |
| 5.9105 | 1850 | 0.0 | - |
| 6.0703 | 1900 | 0.0 | - |
| 6.2300 | 1950 | 0.0 | - |
| 6.3898 | 2000 | 0.0 | - |
| 6.5495 | 2050 | 0.0 | - |
| 6.7093 | 2100 | 0.0 | - |
| 6.8690 | 2150 | 0.0 | - |
| 7.0288 | 2200 | 0.0 | - |
| 7.1885 | 2250 | 0.0 | - |
| 7.3482 | 2300 | 0.0 | - |
| 7.5080 | 2350 | 0.0 | - |
| 7.6677 | 2400 | 0.0 | - |
| 7.8275 | 2450 | 0.0 | - |
| 7.9872 | 2500 | 0.0 | - |
| 8.1470 | 2550 | 0.0 | - |
| 8.3067 | 2600 | 0.0 | - |
| 8.4665 | 2650 | 0.0 | - |
| 8.6262 | 2700 | 0.0 | - |
| 8.7859 | 2750 | 0.0 | - |
| 8.9457 | 2800 | 0.0 | - |
| 9.1054 | 2850 | 0.0 | - |
| 9.2652 | 2900 | 0.0 | - |
| 9.4249 | 2950 | 0.0 | - |
| 9.5847 | 3000 | 0.0 | - |
| 9.7444 | 3050 | 0.0 | - |
| 9.9042 | 3100 | 0.0 | - |
| 10.0639 | 3150 | 0.0 | - |
| 10.2236 | 3200 | 0.0 | - |
| 10.3834 | 3250 | 0.0 | - |
| 10.5431 | 3300 | 0.0 | - |
| 10.7029 | 3350 | 0.0 | - |
| 10.8626 | 3400 | 0.0 | - |
| 11.0224 | 3450 | 0.0 | - |
| 11.1821 | 3500 | 0.0 | - |
| 11.3419 | 3550 | 0.0 | - |
| 11.5016 | 3600 | 0.0 | - |
| 11.6613 | 3650 | 0.0 | - |
| 11.8211 | 3700 | 0.0 | - |
| 11.9808 | 3750 | 0.0 | - |
| 12.1406 | 3800 | 0.0 | - |
| 12.3003 | 3850 | 0.0 | - |
| 12.4601 | 3900 | 0.0 | - |
| 12.6198 | 3950 | 0.0 | - |
| 12.7796 | 4000 | 0.0017 | - |
| 12.9393 | 4050 | 0.0052 | - |
| 13.0990 | 4100 | 0.0005 | - |
| 13.2588 | 4150 | 0.0 | - |
| 13.4185 | 4200 | 0.0 | - |
| 13.5783 | 4250 | 0.0 | - |
| 13.7380 | 4300 | 0.0002 | - |
| 13.8978 | 4350 | 0.0 | - |
| 14.0575 | 4400 | 0.0 | - |
| 14.2173 | 4450 | 0.0 | - |
| 14.3770 | 4500 | 0.0 | - |
| 14.5367 | 4550 | 0.0 | - |
| 14.6965 | 4600 | 0.0 | - |
| 14.8562 | 4650 | 0.0 | - |
| 15.0160 | 4700 | 0.0 | - |
| 15.1757 | 4750 | 0.0 | - |
| 15.3355 | 4800 | 0.0 | - |
| 15.4952 | 4850 | 0.0 | - |
| 15.6550 | 4900 | 0.0 | - |
| 15.8147 | 4950 | 0.0 | - |
| 15.9744 | 5000 | 0.0 | - |
| 16.1342 | 5050 | 0.0 | - |
| 16.2939 | 5100 | 0.0 | - |
| 16.4537 | 5150 | 0.0 | - |
| 16.6134 | 5200 | 0.0 | - |
| 16.7732 | 5250 | 0.0 | - |
| 16.9329 | 5300 | 0.0 | - |
| 17.0927 | 5350 | 0.0 | - |
| 17.2524 | 5400 | 0.0 | - |
| 17.4121 | 5450 | 0.0 | - |
| 17.5719 | 5500 | 0.0 | - |
| 17.7316 | 5550 | 0.0 | - |
| 17.8914 | 5600 | 0.0 | - |
| 18.0511 | 5650 | 0.0 | - |
| 18.2109 | 5700 | 0.0 | - |
| 18.3706 | 5750 | 0.0 | - |
| 18.5304 | 5800 | 0.0 | - |
| 18.6901 | 5850 | 0.0 | - |
| 18.8498 | 5900 | 0.0 | - |
| 19.0096 | 5950 | 0.0 | - |
| 19.1693 | 6000 | 0.0 | - |
| 19.3291 | 6050 | 0.0 | - |
| 19.4888 | 6100 | 0.0 | - |
| 19.6486 | 6150 | 0.0 | - |
| 19.8083 | 6200 | 0.0 | - |
| 19.9681 | 6250 | 0.0 | - |
| 20.1278 | 6300 | 0.0 | - |
| 20.2875 | 6350 | 0.0 | - |
| 20.4473 | 6400 | 0.0 | - |
| 20.6070 | 6450 | 0.0 | - |
| 20.7668 | 6500 | 0.0 | - |
| 20.9265 | 6550 | 0.0 | - |
| 21.0863 | 6600 | 0.0 | - |
| 21.2460 | 6650 | 0.0 | - |
| 21.4058 | 6700 | 0.0 | - |
| 21.5655 | 6750 | 0.0 | - |
| 21.7252 | 6800 | 0.0 | - |
| 21.8850 | 6850 | 0.0 | - |
| 22.0447 | 6900 | 0.0 | - |
| 22.2045 | 6950 | 0.0 | - |
| 22.3642 | 7000 | 0.0 | - |
| 22.5240 | 7050 | 0.0 | - |
| 22.6837 | 7100 | 0.0 | - |
| 22.8435 | 7150 | 0.0 | - |
| 23.0032 | 7200 | 0.0 | - |
| 23.1629 | 7250 | 0.0 | - |
| 23.3227 | 7300 | 0.0 | - |
| 23.4824 | 7350 | 0.0 | - |
| 23.6422 | 7400 | 0.0 | - |
| 23.8019 | 7450 | 0.0 | - |
| 23.9617 | 7500 | 0.0 | - |
| 24.1214 | 7550 | 0.0 | - |
| 24.2812 | 7600 | 0.0 | - |
| 24.4409 | 7650 | 0.0 | - |
| 24.6006 | 7700 | 0.0 | - |
| 24.7604 | 7750 | 0.0 | - |
| 24.9201 | 7800 | 0.0 | - |
| 25.0799 | 7850 | 0.0 | - |
| 25.2396 | 7900 | 0.0 | - |
| 25.3994 | 7950 | 0.0 | - |
| 25.5591 | 8000 | 0.0 | - |
| 25.7188 | 8050 | 0.0 | - |
| 25.8786 | 8100 | 0.0 | - |
| 26.0383 | 8150 | 0.0 | - |
| 26.1981 | 8200 | 0.0 | - |
| 26.3578 | 8250 | 0.0 | - |
| 26.5176 | 8300 | 0.0 | - |
| 26.6773 | 8350 | 0.0 | - |
| 26.8371 | 8400 | 0.0 | - |
| 26.9968 | 8450 | 0.0 | - |
| 27.1565 | 8500 | 0.0 | - |
| 27.3163 | 8550 | 0.0 | - |
| 27.4760 | 8600 | 0.0 | - |
| 27.6358 | 8650 | 0.0 | - |
| 27.7955 | 8700 | 0.0 | - |
| 27.9553 | 8750 | 0.0 | - |
| 28.1150 | 8800 | 0.0 | - |
| 28.2748 | 8850 | 0.0 | - |
| 28.4345 | 8900 | 0.0 | - |
| 28.5942 | 8950 | 0.0 | - |
| 28.7540 | 9000 | 0.0 | - |
| 28.9137 | 9050 | 0.0 | - |
| 29.0735 | 9100 | 0.0001 | - |
| 29.2332 | 9150 | 0.0 | - |
| 29.3930 | 9200 | 0.0 | - |
| 29.5527 | 9250 | 0.0 | - |
| 29.7125 | 9300 | 0.0 | - |
| 29.8722 | 9350 | 0.0 | - |
### Framework Versions
- Python: 3.10.12
- SetFit: 1.1.0
- Sentence Transformers: 3.3.1
- Transformers: 4.44.2
- PyTorch: 2.2.0a0+81ea7a4
- Datasets: 3.2.0
- Tokenizers: 0.19.1
## Citation
### BibTeX
```bibtex
@article{https://doi.org/10.48550/arxiv.2209.11055,
doi = {10.48550/ARXIV.2209.11055},
url = {https://arxiv.org/abs/2209.11055},
author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
title = {Efficient Few-Shot Learning Without Prompts},
publisher = {arXiv},
year = {2022},
copyright = {Creative Commons Attribution 4.0 International}
}
```
<!--
## Glossary
*Clearly define terms in order to be accessible across audiences.*
-->
<!--
## Model Card Authors
*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->
<!--
## Model Card Contact
*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->