---
license: other
base_model: nvidia/segformer-b0-finetuned-ade-512-512
tags:
- generated_from_trainer
model-index:
- name: segformer-breastcancer
  results: []
datasets:
- as-cle-bert/breastcancer-semantic-segmentation
pipeline_tag: image-segmentation
---

# segformer-breastcancer

This model is a fine-tuned version of [nvidia/segformer-b0-finetuned-ade-512-512](https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512) on the [as-cle-bert/breastcancer-semantic-segmentation](https://huggingface.co/datasets/as-cle-bert/breastcancer-semantic-segmentation) dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1986
- Mean Iou: 0.4951
- Mean Accuracy: 0.5647
- Overall Accuracy: 0.5716
- Per Category Iou: [0.41886373003284666, 0.5713219432574086]
- Per Category Accuracy: [0.542773911636187, 0.5866474640793707]
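
These metric names match the output of the `mean_iou` metric from the Hugging Face `evaluate` library. Below is a sketch of how such numbers are computed, assuming two 0-indexed categories; `ignore_index=255` and `reduce_labels=False` are assumptions, not values recorded in this card:

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Placeholder label maps; in practice these are upsampled model predictions
# and ground-truth masks of shape (H, W)
predictions = [np.random.randint(0, 2, (512, 512))]
references = [np.random.randint(0, 2, (512, 512))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=2,        # two categories, matching the per-category lists above
    ignore_index=255,    # assumption: common convention for unlabeled pixels
    reduce_labels=False, # assumption: labels are already 0-indexed
)
print(results["mean_iou"], results["per_category_iou"])
```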

## Model description

SegFormer pairs a hierarchical Transformer (MiT) encoder with a lightweight all-MLP decode head for semantic segmentation. This checkpoint fine-tunes the b0 variant, pre-trained on ADE20K at 512x512 resolution, for two-category segmentation of breast cancer images (the per-category metrics above report one IoU/accuracy value per category).

## Intended uses & limitations

This model is intended for research and experimentation on semantic segmentation of breast cancer imaging. It has not been clinically validated and should not be used for diagnosis. The evaluation mean IoU of roughly 0.50 is modest, so predicted masks are best treated as coarse region proposals.
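
A minimal inference sketch using the standard Transformers segmentation API (`MODEL_ID` is a placeholder; replace it with this checkpoint's repository id, and `scan.png` with your own image):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

MODEL_ID = "segformer-breastcancer"  # placeholder: substitute the actual repo id

processor = SegformerImageProcessor.from_pretrained(MODEL_ID)
model = SegformerForSemanticSegmentation.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("scan.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample to the input resolution and take the per-pixel argmax
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) map of category ids
```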

## Training and evaluation data

The model was trained and evaluated on [as-cle-bert/breastcancer-semantic-segmentation](https://huggingface.co/datasets/as-cle-bert/breastcancer-semantic-segmentation), which pairs breast cancer images with per-pixel segmentation masks.
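
The dataset can be loaded with the `datasets` library; inspect the returned `DatasetDict` for the actual split and column names:

```python
from datasets import load_dataset

# Dataset id taken from this card's metadata
ds = load_dataset("as-cle-bert/breastcancer-semantic-segmentation")
print(ds)
```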

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
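
A sketch of how these hyperparameters map onto `TrainingArguments` (the output directory and evaluation cadence are assumptions; the results table below suggests evaluation every 20 steps, and the Adam betas/epsilon listed above are the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-breastcancer",  # assumption
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumption, inferred from the results table
    eval_steps=20,                # assumption
)
```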

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou                           | Per Category Accuracy                      |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------------------------------:|:------------------------------------------:|
| 0.9179        | 1.25  | 20   | 0.8275          | 0.1056   | 0.2990        | 0.2215           | [0.15928433223106872, 0.05189369644942194] | [0.5449101796407185, 0.053152424747755486] |
| 0.7951        | 2.5   | 40   | 0.7554          | 0.3808   | 0.6154        | 0.6539           | [0.2962250026735109, 0.46535774064135604]  | [0.4931218643793494, 0.7375983290380178]   |
| 0.6317        | 3.75  | 60   | 0.5784          | 0.2076   | 0.3576        | 0.3005           | [0.24602488191071786, 0.16910477266308951] | [0.5386308464152776, 0.17651220374955784]  |
| 0.5525        | 5.0   | 80   | 0.4935          | 0.3310   | 0.4279        | 0.3908           | [0.3572223576675606, 0.30487703968490387]  | [0.5453956950962939, 0.31031549514039786]  |
| 0.4365        | 6.25  | 100  | 0.4277          | 0.4259   | 0.5007        | 0.5093           | [0.3753112405986087, 0.4765198093920762]   | [0.473248098397799, 0.528071150639244]     |
| 0.3658        | 7.5   | 120  | 0.3757          | 0.3739   | 0.4207        | 0.4501           | [0.2934911929427469, 0.45430117531467024]  | [0.32736688784592977, 0.5140397864133273]  |
| 0.357         | 8.75  | 140  | 0.3155          | 0.4305   | 0.5273        | 0.5652           | [0.31276016750127367, 0.5482260296446353]  | [0.40734746722770676, 0.6473124799973049]  |
| 0.2889        | 10.0  | 160  | 0.3121          | 0.4761   | 0.5439        | 0.5495           | [0.39972203089638886, 0.5525428502787649]  | [0.5259588930247613, 0.56174305590648]     |
| 0.2536        | 11.25 | 180  | 0.2611          | 0.4607   | 0.5411        | 0.5586           | [0.37248963582652733, 0.5489196143472734]  | [0.4856772940605276, 0.5965098455370829]   |
| 0.3375        | 12.5  | 200  | 0.2522          | 0.3905   | 0.4676        | 0.4535           | [0.3615823724169426, 0.4193968866718472]   | [0.512558666450882, 0.4227348526959422]    |
| 0.1835        | 13.75 | 220  | 0.2393          | 0.4343   | 0.4809        | 0.5004           | [0.3816968232451229, 0.4869246466631396]   | [0.41924259588930246, 0.5425994239223811]  |
| 0.1878        | 15.0  | 240  | 0.2364          | 0.3883   | 0.4769        | 0.4591           | [0.3594858252766199, 0.4170536161683648]   | [0.5331607056157954, 0.42058719490626106]  |
| 0.1804        | 16.25 | 260  | 0.2388          | 0.3503   | 0.4221        | 0.3934           | [0.3722368961671656, 0.3283766624340039]   | [0.5131736526946108, 0.3310593427324945]   |
| 0.2296        | 17.5  | 280  | 0.2108          | 0.3845   | 0.4523        | 0.4383           | [0.36382381172455475, 0.4051134890024848]  | [0.4968765172357987, 0.40781915879192143]  |
| 0.1752        | 18.75 | 300  | 0.2065          | 0.4408   | 0.5307        | 0.5278           | [0.37362255868123995, 0.5080655748465653]  | [0.539941738145331, 0.5215102666464534]    |
| 0.1404        | 20.0  | 320  | 0.2025          | 0.4192   | 0.5049        | 0.4948           | [0.37603680369849973, 0.4624047452321127]  | [0.5370771969574365, 0.4727289571647548]   |
| 0.1044        | 21.25 | 340  | 0.1993          | 0.4134   | 0.5006        | 0.4938           | [0.36164057945015027, 0.46514651056315]    | [0.5219938501375627, 0.4791635083463877]   |
| 0.1047        | 22.5  | 360  | 0.1995          | 0.4409   | 0.5612        | 0.5654           | [0.35316826827766823, 0.5286988461568266]  | [0.5477909046771322, 0.5746205804571564]   |
| 0.0969        | 23.75 | 380  | 0.1934          | 0.4208   | 0.5256        | 0.5171           | [0.3610564616784075, 0.480532337904731]    | [0.5524356692021363, 0.49872824970101237]  |
| 0.1198        | 25.0  | 400  | 0.2100          | 0.4047   | 0.4892        | 0.4726           | [0.377810637529348, 0.43159533203482664]   | [0.5416895937854022, 0.4366988394225748]   |
| 0.116         | 26.25 | 420  | 0.2038          | 0.4208   | 0.5123        | 0.5040           | [0.3659432240473206, 0.47558361909786334]  | [0.5386632141123159, 0.48590968046220967]  |
| 0.0803        | 27.5  | 440  | 0.2035          | 0.4643   | 0.5486        | 0.5520           | [0.3885018236229309, 0.5400125204269953]   | [0.537854021686357, 0.5594101099937676]    |
| 0.1031        | 28.75 | 460  | 0.2068          | 0.4193   | 0.5268        | 0.5199           | [0.3565531095848628, 0.48207738324971056]  | [0.5486324648001295, 0.5049522461973824]   |
| 0.0652        | 30.0  | 480  | 0.1906          | 0.4799   | 0.5572        | 0.5719           | [0.39256244632789455, 0.5671483599490623]  | [0.5104709499919081, 0.6039045260835144]   |
| 0.0865        | 31.25 | 500  | 0.1946          | 0.4660   | 0.5319        | 0.5360           | [0.4022848534304187, 0.5297039831736081]   | [0.5185952419485353, 0.5451176579581248]   |
| 0.0781        | 32.5  | 520  | 0.2018          | 0.4170   | 0.4977        | 0.4881           | [0.37508619500758517, 0.4588260589120619]  | [0.5281922641204079, 0.46729664628497314]  |
| 0.0922        | 33.75 | 540  | 0.1932          | 0.4649   | 0.5558        | 0.5608           | [0.39512968947922955, 0.5346638407173079]  | [0.5401683120245995, 0.571521215490087]    |
| 0.0802        | 35.0  | 560  | 0.2029          | 0.4519   | 0.5364        | 0.5344           | [0.3877223005943433, 0.5161263869184783]   | [0.5426606246965529, 0.5300756312429464]   |
| 0.0737        | 36.25 | 580  | 0.1983          | 0.4605   | 0.5598        | 0.5666           | [0.3930664524057094, 0.5280028151990147]   | [0.5383719048389707, 0.5812993750736941]   |
| 0.0766        | 37.5  | 600  | 0.2097          | 0.4902   | 0.5645        | 0.5701           | [0.41298901286924217, 0.5674679408239331]  | [0.5468846091600582, 0.5821500160021561]   |
| 0.0663        | 38.75 | 620  | 0.1926          | 0.5041   | 0.5653        | 0.5781           | [0.42229021548076295, 0.5859655697770101]  | [0.5249069428710147, 0.6057405629390065]   |
| 0.0572        | 40.0  | 640  | 0.1944          | 0.4884   | 0.5550        | 0.5643           | [0.41379925802215733, 0.5630840363400389]  | [0.525295355235475, 0.5846429834756683]    |
| 0.1065        | 41.25 | 660  | 0.1949          | 0.4713   | 0.5603        | 0.5687           | [0.4052270716602772, 0.537297205601135]    | [0.5337271403139666, 0.5868664409520441]   |
| 0.0881        | 42.5  | 680  | 0.1945          | 0.4557   | 0.5355        | 0.5362           | [0.38861418270649184, 0.5228113541121006]  | [0.5329341317365269, 0.5379672208465983]   |
| 0.0616        | 43.75 | 700  | 0.2055          | 0.4851   | 0.5479        | 0.5493           | [0.4288067420034476, 0.5413945423770796]   | [0.543486000971031, 0.5522512506948305]    |
| 0.135         | 45.0  | 720  | 0.2017          | 0.4950   | 0.5702        | 0.5770           | [0.4186215922560253, 0.5714192766576933]   | [0.5487133840427254, 0.5917428874627318]   |
| 0.0683        | 46.25 | 740  | 0.1986          | 0.4880   | 0.5579        | 0.5633           | [0.41617258731503165, 0.5599071727881785]  | [0.5407347467227707, 0.5750585342025031]   |
| 0.0962        | 47.5  | 760  | 0.2010          | 0.4907   | 0.5660        | 0.5730           | [0.41037067786677084, 0.571094427269902]   | [0.543955332578087, 0.5881213468762106]    |
| 0.0534        | 48.75 | 780  | 0.2061          | 0.4941   | 0.5671        | 0.5740           | [0.4158937943809818, 0.5723742349360128]   | [0.5450234665803528, 0.5891404315528829]   |
| 0.069         | 50.0  | 800  | 0.1986          | 0.4951   | 0.5647        | 0.5716           | [0.41886373003284666, 0.5713219432574086]  | [0.542773911636187, 0.5866474640793707]    |


### Framework versions

- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2