SetFit with klue/roberta-base

This is a SetFit model that can be used for Text Classification. It uses klue/roberta-base as the Sentence Transformer embedding model and a LogisticRegression instance as the classification head.

The model has been trained using an efficient few-shot learning technique that involves:

  1. Fine-tuning a Sentence Transformer with contrastive learning.
  2. Training a classification head with features from the fine-tuned Sentence Transformer.
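
As a rough sketch of how these two steps look in code, the snippet below follows the standard SetFit training loop (the dataset contents are placeholders, not the data this model was trained on; the actual settings are listed under "Training Hyperparameters" below):

from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

# Placeholder few-shot data with the default "text"/"label" column names.
train_dataset = Dataset.from_dict({
    "text": ["example product title A", "example product title B"],
    "label": [0, 1],
})

# Start from the klue/roberta-base body; SetFit attaches a LogisticRegression head by default.
model = SetFitModel.from_pretrained("klue/roberta-base")

args = TrainingArguments(batch_size=64, num_epochs=1)

trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
# train() runs both phases: contrastive fine-tuning of the body, then fitting the head.
trainer.train()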

Model Details

Model Description

  • Model Type: SetFit
  • Sentence Transformer body: klue/roberta-base
  • Classification head: a LogisticRegression instance
  • Maximum Sequence Length: 512 tokens
  • Number of Classes: 5 classes
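
To inspect these two components after loading, SetFit typically exposes them as attributes (a minimal sketch, assuming the standard SetFit 1.x attribute names):

from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_cate_top_bt5_4")
print(model.model_body)  # SentenceTransformer built on klue/roberta-base (max 512 tokens)
print(model.model_head)  # scikit-learn LogisticRegression head over the 5 classes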

Model Labels

Label 2:
  • '에스쁘아 프로테일러 비글로우 스틱 파운데이션 13g 23호베이지 (#M)홈>화장품/미용>베이스메이크업>파운데이션>스틱형 Naverstore > 화장품/미용 > 베이스메이크업 > 파운데이션 > 스틱형'
  • '그라펜 에어커버 스틱 파운데이션 23호 베이지 LotteOn > 뷰티 > 메이크업 > 베이스메이크업 > 파운데이션 LotteOn > 뷰티 > 메이크업 > 베이스메이크업 > 파운데이션'
  • '바비 브라운 스킨 파운데이션 스틱-2.5 원 샌드 9g (#M)화장품/미용>베이스메이크업>파운데이션>크림형 Naverstore > 화장품/미용 > 베이스메이크업 > 파운데이션 > 크림형'
Label 1:
  • '정샘물 스킨 세팅 톤 코렉팅 베이스 40ml 글로잉 베이스 (#M)11st>메이크업>페이스메이크업>메이크업베이스 11st > 뷰티 > 메이크업 > 페이스메이크업 > 메이크업베이스'
  • '아이오페 퍼펙트 커버 메이크업베이스 35ml 2호 라이트퍼플 × 3개 (#M)쿠팡 홈>뷰티>메이크업>베이스 메이크업>베이스/프라이머 Coupang > 뷰티 > 메이크업 > 베이스 메이크업 > 베이스/프라이머'
  • '아이오페 퍼펙트 커버 베이스 35ml 2호-퍼플 (#M)홈>화장품/미용>베이스메이크업>메이크업베이스 Naverstore > 화장품/미용 > 베이스메이크업 > 메이크업베이스'
Label 0:
  • '헤라 글로우 래스팅 파운데이션 17C1 페탈 아이보리 LotteOn > 뷰티 > 메이크업 > 베이스메이크업 > 베이스/프라이머 LotteOn > 뷰티 > 메이크업 > 베이스메이크업 > 베이스/프라이머'
  • '[에스티 로더] 더블웨어 파운데이션 30ml SPF 10/PA++ (+프라이머 정품 ) 1W0 웜 포슬린 홈>기획 세트;홈>더블웨어;홈>더블 웨어;화장품/미용>베이스메이크업>파운데이션>리퀴드형;(#M)홈>전체상품 Naverstore > 베이스메이크업 > 파운데이션'
  • '에스쁘아 프로테일러 파운데이션 비 글로우 10ml 4호 베이지 × 1개 (#M)쿠팡 홈>뷰티>메이크업>베이스 메이크업>파운데이션 Coupang > 뷰티 > 로드샵 > 메이크업 > 베이스 메이크업 > 파운데이션'
Label 4:
  • '시세이도 스포츠 커버 파운데이션 20g S101 (#M)홈>화장품/미용>베이스메이크업>파운데이션>크림형 Naverstore > 화장품/미용 > 베이스메이크업 > 파운데이션 > 크림형'
  • '시세이도 스포츠 커버 파운데이션 20g S100 × 1개 Coupang > 뷰티 > 메이크업 > 베이스 메이크업 > 파운데이션;(#M)쿠팡 홈>뷰티>메이크업>베이스 메이크업>파운데이션 Coupang > 뷰티 > 메이크업 > 베이스 메이크업 > 파운데이션'
  • '에이지투웨니스 오리지날 샤이닝드롭 케이스+리필3개 (+커피쿠폰+폼20ml) 샤이닝드롭(화이트)23호케이스+리필3개_폼20ml (#M)화장품/미용>베이스메이크업>파운데이션>쿠션형 AD > Naverstore > 화장품/미용 > 베이스메이크업 > 파운데이션 > 크림형'
Label 3:
  • '매트 벨벳 스킨 컴팩트 스폰지 단품없음 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품'
  • '[BF적립] 엉크르 드 뽀 쿠션&리필 세트(+스탠딩 미러+5천LPOINT) 20호_15호 LOREAL > DepartmentLotteOn > 입생로랑 > Branded > 입생로랑 LOREAL > DepartmentLotteOn > 입생로랑 > Branded > 입생로랑'
  • '코튼 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품 LotteOn > 뷰티 > 뷰티기기 > 액세서리/소모품'

Evaluation

Metrics

Label | Accuracy
all   | 0.9475
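
The evaluation split itself is not included in this card. As a minimal sketch, a score like the one above could be reproduced with scikit-learn (the texts and labels below are placeholders; substitute the real held-out data):

from sklearn.metrics import accuracy_score
from setfit import SetFitModel

model = SetFitModel.from_pretrained("mini1013/master_cate_top_bt5_4")

# Placeholder evaluation examples; labels follow the 0-4 scheme shown above.
eval_texts = ["placeholder product title 1", "placeholder product title 2"]
eval_labels = [0, 1]

preds = model.predict(eval_texts)
print(accuracy_score(eval_labels, [int(p) for p in preds]))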

Uses

Direct Use for Inference

First install the SetFit library:

pip install setfit

Then you can load this model and run inference:

from setfit import SetFitModel

# Download from the 🤗 Hub
model = SetFitModel.from_pretrained("mini1013/master_cate_top_bt5_4")
# Run inference
preds = model("[시세이도] NEW 싱크로 스킨 래디언트 리프팅 파운데이션 SPF30/PA++++ 30ml 130 오팔 (#M)홈>메이크업>베이스메이크업 HMALL > 뷰티 > 메이크업 > 베이스메이크업")

Training Details

Training Set Metrics

Training set | Min | Median | Max
Word count   | 12  | 22.928 | 52

Label | Training Sample Count
0     | 50
1     | 50
2     | 50
3     | 50
4     | 50

Training Hyperparameters

  • batch_size: (64, 64)
  • num_epochs: (30, 30)
  • max_steps: -1
  • sampling_strategy: oversampling
  • num_iterations: 100
  • body_learning_rate: (2e-05, 1e-05)
  • head_learning_rate: 0.01
  • loss: CosineSimilarityLoss
  • distance_metric: cosine_distance
  • margin: 0.25
  • end_to_end: False
  • use_amp: False
  • warmup_proportion: 0.1
  • l2_weight: 0.01
  • seed: 42
  • eval_max_steps: -1
  • load_best_model_at_end: False
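
These values map one-to-one onto SetFit's TrainingArguments. A sketch of the equivalent configuration (assuming the SetFit 1.1.0 argument names, with the loss and distance metric taken from sentence-transformers):

from sentence_transformers.losses import BatchHardTripletLossDistanceFunction, CosineSimilarityLoss
from setfit import TrainingArguments

args = TrainingArguments(
    batch_size=(64, 64),                  # (embedding phase, classifier phase)
    num_epochs=(30, 30),
    max_steps=-1,
    sampling_strategy="oversampling",
    num_iterations=100,
    body_learning_rate=(2e-05, 1e-05),
    head_learning_rate=0.01,
    loss=CosineSimilarityLoss,
    distance_metric=BatchHardTripletLossDistanceFunction.cosine_distance,
    margin=0.25,
    end_to_end=False,
    use_amp=False,
    warmup_proportion=0.1,
    l2_weight=0.01,
    seed=42,
    eval_max_steps=-1,
    load_best_model_at_end=False,
)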

Training Results

Epoch Step Training Loss Validation Loss
0.0026 1 0.521 -
0.1279 50 0.4636 -
0.2558 100 0.42 -
0.3836 150 0.292 -
0.5115 200 0.1539 -
0.6394 250 0.0626 -
0.7673 300 0.0343 -
0.8951 350 0.0071 -
1.0230 400 0.0023 -
1.1509 450 0.0005 -
1.2788 500 0.0006 -
1.4066 550 0.0003 -
1.5345 600 0.0002 -
1.6624 650 0.0001 -
1.7903 700 0.0002 -
1.9182 750 0.0006 -
2.0460 800 0.0002 -
2.1739 850 0.0001 -
2.3018 900 0.0 -
2.4297 950 0.0 -
2.5575 1000 0.0 -
2.6854 1050 0.0 -
2.8133 1100 0.0 -
2.9412 1150 0.0 -
3.0691 1200 0.0 -
3.1969 1250 0.0 -
3.3248 1300 0.0 -
3.4527 1350 0.0007 -
3.5806 1400 0.0005 -
3.7084 1450 0.0009 -
3.8363 1500 0.0008 -
3.9642 1550 0.0003 -
4.0921 1600 0.0002 -
4.2199 1650 0.0 -
4.3478 1700 0.0 -
4.4757 1750 0.0 -
4.6036 1800 0.0 -
4.7315 1850 0.0 -
4.8593 1900 0.0 -
4.9872 1950 0.0 -
5.1151 2000 0.0 -
5.2430 2050 0.0 -
5.3708 2100 0.0 -
5.4987 2150 0.0 -
5.6266 2200 0.0 -
5.7545 2250 0.0 -
5.8824 2300 0.0 -
6.0102 2350 0.0001 -
6.1381 2400 0.0006 -
6.2660 2450 0.0 -
6.3939 2500 0.0 -
6.5217 2550 0.0 -
6.6496 2600 0.0 -
6.7775 2650 0.0 -
6.9054 2700 0.0 -
7.0332 2750 0.0 -
7.1611 2800 0.0 -
7.2890 2850 0.0 -
7.4169 2900 0.0 -
7.5448 2950 0.0 -
7.6726 3000 0.0 -
7.8005 3050 0.0 -
7.9284 3100 0.0 -
8.0563 3150 0.0 -
8.1841 3200 0.0 -
8.3120 3250 0.0 -
8.4399 3300 0.0 -
8.5678 3350 0.0 -
8.6957 3400 0.0 -
8.8235 3450 0.0 -
8.9514 3500 0.0 -
9.0793 3550 0.0 -
9.2072 3600 0.0 -
9.3350 3650 0.0 -
9.4629 3700 0.0 -
9.5908 3750 0.0 -
9.7187 3800 0.0 -
9.8465 3850 0.0 -
9.9744 3900 0.0 -
10.1023 3950 0.0 -
10.2302 4000 0.0 -
10.3581 4050 0.0 -
10.4859 4100 0.0 -
10.6138 4150 0.0 -
10.7417 4200 0.0 -
10.8696 4250 0.0 -
10.9974 4300 0.0 -
11.1253 4350 0.0 -
11.2532 4400 0.0 -
11.3811 4450 0.0 -
11.5090 4500 0.0 -
11.6368 4550 0.0 -
11.7647 4600 0.0 -
11.8926 4650 0.0 -
12.0205 4700 0.0 -
12.1483 4750 0.0 -
12.2762 4800 0.0 -
12.4041 4850 0.0 -
12.5320 4900 0.0 -
12.6598 4950 0.0 -
12.7877 5000 0.0 -
12.9156 5050 0.0 -
13.0435 5100 0.0 -
13.1714 5150 0.0 -
13.2992 5200 0.0 -
13.4271 5250 0.0 -
13.5550 5300 0.0 -
13.6829 5350 0.0 -
13.8107 5400 0.0 -
13.9386 5450 0.0 -
14.0665 5500 0.0 -
14.1944 5550 0.0 -
14.3223 5600 0.0 -
14.4501 5650 0.0 -
14.5780 5700 0.0 -
14.7059 5750 0.0 -
14.8338 5800 0.0 -
14.9616 5850 0.0 -
15.0895 5900 0.0 -
15.2174 5950 0.0 -
15.3453 6000 0.0 -
15.4731 6050 0.0 -
15.6010 6100 0.0 -
15.7289 6150 0.0 -
15.8568 6200 0.0 -
15.9847 6250 0.0 -
16.1125 6300 0.0 -
16.2404 6350 0.0 -
16.3683 6400 0.0 -
16.4962 6450 0.0 -
16.6240 6500 0.0 -
16.7519 6550 0.0 -
16.8798 6600 0.0 -
17.0077 6650 0.0 -
17.1355 6700 0.0 -
17.2634 6750 0.0 -
17.3913 6800 0.0 -
17.5192 6850 0.0 -
17.6471 6900 0.0 -
17.7749 6950 0.0 -
17.9028 7000 0.0 -
18.0307 7050 0.0 -
18.1586 7100 0.0 -
18.2864 7150 0.0 -
18.4143 7200 0.0 -
18.5422 7250 0.0 -
18.6701 7300 0.0 -
18.7980 7350 0.0 -
18.9258 7400 0.0 -
19.0537 7450 0.0 -
19.1816 7500 0.0 -
19.3095 7550 0.0004 -
19.4373 7600 0.0028 -
19.5652 7650 0.0003 -
19.6931 7700 0.0002 -
19.8210 7750 0.0 -
19.9488 7800 0.0 -
20.0767 7850 0.0 -
20.2046 7900 0.0 -
20.3325 7950 0.0 -
20.4604 8000 0.0 -
20.5882 8050 0.0 -
20.7161 8100 0.0 -
20.8440 8150 0.0 -
20.9719 8200 0.0 -
21.0997 8250 0.0 -
21.2276 8300 0.0 -
21.3555 8350 0.0 -
21.4834 8400 0.0 -
21.6113 8450 0.0 -
21.7391 8500 0.0 -
21.8670 8550 0.0 -
21.9949 8600 0.0 -
22.1228 8650 0.0 -
22.2506 8700 0.0 -
22.3785 8750 0.0 -
22.5064 8800 0.0 -
22.6343 8850 0.0 -
22.7621 8900 0.0 -
22.8900 8950 0.0 -
23.0179 9000 0.0 -
23.1458 9050 0.0 -
23.2737 9100 0.0 -
23.4015 9150 0.0 -
23.5294 9200 0.0 -
23.6573 9250 0.0 -
23.7852 9300 0.0 -
23.9130 9350 0.0 -
24.0409 9400 0.0 -
24.1688 9450 0.0 -
24.2967 9500 0.0 -
24.4246 9550 0.0 -
24.5524 9600 0.0 -
24.6803 9650 0.0 -
24.8082 9700 0.0 -
24.9361 9750 0.0 -
25.0639 9800 0.0 -
25.1918 9850 0.0 -
25.3197 9900 0.0 -
25.4476 9950 0.0 -
25.5754 10000 0.0 -
25.7033 10050 0.0 -
25.8312 10100 0.0 -
25.9591 10150 0.0 -
26.0870 10200 0.0 -
26.2148 10250 0.0 -
26.3427 10300 0.0 -
26.4706 10350 0.0 -
26.5985 10400 0.0 -
26.7263 10450 0.0 -
26.8542 10500 0.0 -
26.9821 10550 0.0 -
27.1100 10600 0.0 -
27.2379 10650 0.0 -
27.3657 10700 0.0 -
27.4936 10750 0.0 -
27.6215 10800 0.0 -
27.7494 10850 0.0 -
27.8772 10900 0.0 -
28.0051 10950 0.0 -
28.1330 11000 0.0 -
28.2609 11050 0.0 -
28.3887 11100 0.0 -
28.5166 11150 0.0 -
28.6445 11200 0.0 -
28.7724 11250 0.0 -
28.9003 11300 0.0 -
29.0281 11350 0.0 -
29.1560 11400 0.0 -
29.2839 11450 0.0 -
29.4118 11500 0.0 -
29.5396 11550 0.0 -
29.6675 11600 0.0 -
29.7954 11650 0.0 -
29.9233 11700 0.0 -

Framework Versions

  • Python: 3.10.12
  • SetFit: 1.1.0
  • Sentence Transformers: 3.3.1
  • Transformers: 4.44.2
  • PyTorch: 2.2.0a0+81ea7a4
  • Datasets: 3.2.0
  • Tokenizers: 0.19.1

Citation

BibTeX

@article{https://doi.org/10.48550/arxiv.2209.11055,
    doi = {10.48550/ARXIV.2209.11055},
    url = {https://arxiv.org/abs/2209.11055},
    author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren},
    keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences},
    title = {Efficient Few-Shot Learning Without Prompts},
    publisher = {arXiv},
    year = {2022},
    copyright = {Creative Commons Attribution 4.0 International}
}