---

library_name: transformers
license: other
base_model: nvidia/mit-b0
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: earthquake
  results: []
---

# earthquake

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the gokceKy/earthquake dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2868
- Mean Iou: 0.2379
- Mean Accuracy: 0.2922
- Overall Accuracy: 0.5492
- Accuracy Background: nan
- Accuracy Car: 0.0
- Accuracy Earthquake-roads: 0.4334
- Accuracy Other: 0.1291
- Accuracy Road: 0.6209
- Accuracy Road-cracks: 0.0
- Accuracy Sky: 0.8620
- Accuracy Wall: 0.0
- Iou Background: 0.0
- Iou Car: 0.0
- Iou Earthquake-roads: 0.3329
- Iou Other: 0.1229
- Iou Road: 0.5861
- Iou Road-cracks: 0.0
- Iou Sky: 0.8615
- Iou Wall: 0.0
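
For reference, per-class accuracy/IoU numbers of this kind can be computed with the `evaluate` library's `mean_iou` metric. The sketch below is illustrative only: the predictions and references are random arrays standing in for real segmentation maps, and the `ignore_index` value is an assumption.

```python
import numpy as np
import evaluate

# Load the semantic-segmentation metric (mean IoU + per-category scores).
metric = evaluate.load("mean_iou")

# Illustrative stand-ins for real segmentation maps: 8 labels matching the
# classes listed above (background, car, earthquake-roads, other, road,
# road-cracks, sky, wall).
predictions = [np.random.randint(0, 8, size=(512, 512))]
references = [np.random.randint(0, 8, size=(512, 512))]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=8,
    ignore_index=255,  # assumption: pixels with this id are excluded from scoring
)
print(results["mean_iou"], results["per_category_iou"])
```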

## Model description

This checkpoint is a SegFormer semantic-segmentation model built on the MiT-B0 backbone ([nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0)) and fine-tuned on the gokceKy/earthquake dataset. Judging from the evaluation labels, it assigns each pixel one of eight classes: background, car, earthquake-roads, other, road, road-cracks, sky, or wall.

## Intended uses & limitations

The model is intended for semantic segmentation of earthquake-damage imagery with the classes listed above. Note that on the evaluation set the background, car, road-cracks, and wall classes all score 0.0 IoU, so predictions for those categories should not be relied upon without further training. A minimal inference sketch is shown below.
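
A minimal inference sketch, assuming the fine-tuned weights are available under a placeholder checkpoint id (substitute the actual Hub repo id or local path):

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder: replace with the actual repo id or local output directory.
checkpoint = "path-or-repo-id-of-this-checkpoint"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("street_scene.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```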

## Training and evaluation data

The model was fine-tuned and evaluated on the gokceKy/earthquake dataset referenced above; the train/validation split sizes and the annotation process are not documented here. A hedged loading sketch follows.
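
A loading sketch with the `datasets` library; the dataset id comes from this card, while the split and column names are assumptions to verify against the printed features:

```python
from datasets import load_dataset

# Dataset id taken from the card above.
dataset = load_dataset("gokceKy/earthquake")
print(dataset)  # shows the available splits and their columns

sample = dataset["train"][0]  # assumes a "train" split exists
print(sample.keys())          # expect an image column and a segmentation-mask column
```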

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
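
The hyperparameters above roughly correspond to a `TrainingArguments`/`Trainer` setup like the sketch below. This is not the exact training script: the model, dataset, and metric objects are placeholders, and the evaluation/logging cadence is inferred from the results table (one evaluation every 5 steps).

```python
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="earthquake",
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    num_train_epochs=5,
    eval_strategy="steps",        # cadence inferred from the results table
    eval_steps=5,
    logging_steps=5,
    remove_unused_columns=False,  # keep image/mask columns for the collator
)

trainer = Trainer(
    model=model,                      # placeholder: SegformerForSemanticSegmentation
    args=training_args,
    train_dataset=train_ds,           # placeholder: preprocessed gokceKy/earthquake splits
    eval_dataset=eval_ds,
    compute_metrics=compute_metrics,  # placeholder: e.g. the mean_iou sketch above
)
trainer.train()
```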

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Car | Accuracy Earthquake-roads | Accuracy Other | Accuracy Road | Accuracy Road-cracks | Accuracy Sky | Accuracy Wall | Iou Background | Iou Car | Iou Earthquake-roads | Iou Other | Iou Road | Iou Road-cracks | Iou Sky | Iou Wall |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:------------:|:-------------------------:|:--------------:|:-------------:|:--------------------:|:------------:|:-------------:|:--------------:|:-------:|:--------------------:|:---------:|:--------:|:---------------:|:-------:|:--------:|
| 2.0651        | 0.1316 | 5    | 2.0768          | 0.0374   | 0.1237        | 0.1091           | nan                 | 0.0          | 0.6207                    | 0.2450         | 0.0           | 0.0                  | 0.0          | 0.0           | 0.0            | 0.0     | 0.1556               | 0.1438    | 0.0      | 0.0             | 0.0     | 0.0      |
| 1.8605        | 0.2632 | 10   | 2.0532          | 0.0563   | 0.1524        | 0.2410           | nan                 | 0.0          | 0.7513                    | 0.1438         | 0.1716        | 0.0                  | 0.0          | 0.0           | 0.0            | 0.0     | 0.2042               | 0.0782    | 0.1683   | 0.0             | 0.0     | 0.0      |
| 1.7033        | 0.3947 | 15   | 2.0117          | 0.0892   | 0.1338        | 0.3878           | nan                 | 0.0          | 0.4356                    | 0.0185         | 0.4531        | 0.0                  | 0.0296       | 0.0           | 0.0            | 0.0     | 0.2332               | 0.0152    | 0.4359   | 0.0             | 0.0296  | 0.0      |
| 1.6843        | 0.5263 | 20   | 1.9639          | 0.1220   | 0.2025        | 0.4637           | nan                 | 0.0          | 0.6874                    | 0.0325         | 0.4990        | 0.0                  | 0.1983       | 0.0           | 0.0            | 0.0     | 0.2913               | 0.0309    | 0.4793   | 0.0             | 0.1748  | 0.0      |
| 1.7298        | 0.6579 | 25   | 1.8835          | 0.1234   | 0.1819        | 0.5114           | nan                 | 0.0          | 0.6336                    | 0.0555         | 0.5840        | 0.0                  | 0.0          | 0.0           | 0.0            | 0.0     | 0.3818               | 0.0542    | 0.5516   | 0.0             | 0.0     | 0.0      |
| 1.4833        | 0.7895 | 30   | 1.7512          | 0.1555   | 0.2118        | 0.6238           | nan                 | 0.0          | 0.6148                    | 0.1323         | 0.7352        | 0.0                  | 0.0          | 0.0           | 0.0            | 0.0     | 0.4310               | 0.1293    | 0.6841   | 0.0             | 0.0     | 0.0      |
| 1.5078        | 0.9211 | 35   | 1.6947          | 0.2232   | 0.3260        | 0.6338           | nan                 | 0.0          | 0.6497                    | 0.4078         | 0.6786        | 0.0                  | 0.5459       | 0.0           | 0.0            | 0.0     | 0.4381               | 0.3286    | 0.6399   | 0.0             | 0.3788  | 0.0      |
| 1.2623        | 1.0526 | 40   | 1.6373          | 0.2083   | 0.3221        | 0.5803           | nan                 | 0.0          | 0.6784                    | 0.3435         | 0.6017        | 0.0                  | 0.6309       | 0.0           | 0.0            | 0.0     | 0.4245               | 0.2883    | 0.5717   | 0.0             | 0.3819  | 0.0      |
| 1.124         | 1.1842 | 45   | 1.4797          | 0.2175   | 0.3017        | 0.6133           | nan                 | 0.0          | 0.6023                    | 0.2190         | 0.6795        | 0.0                  | 0.6111       | 0.0           | 0.0            | 0.0     | 0.4482               | 0.2035    | 0.6431   | 0.0             | 0.4454  | 0.0      |
| 1.1958        | 1.3158 | 50   | 1.3304          | 0.2445   | 0.3053        | 0.6530           | nan                 | 0.0          | 0.5340                    | 0.2660         | 0.7434        | 0.0                  | 0.5935       | 0.0           | 0.0            | 0.0     | 0.4644               | 0.2522    | 0.6977   | 0.0             | 0.5413  | 0.0      |
| 1.6012        | 1.4474 | 55   | 1.2994          | 0.2460   | 0.3082        | 0.6439           | nan                 | 0.0          | 0.5639                    | 0.3192         | 0.7202        | 0.0                  | 0.5538       | 0.0           | 0.0            | 0.0     | 0.4683               | 0.3049    | 0.6765   | 0.0             | 0.5182  | 0.0      |
| 0.8773        | 1.5789 | 60   | 1.3010          | 0.2476   | 0.3222        | 0.6103           | nan                 | 0.0          | 0.6028                    | 0.3035         | 0.6603        | 0.0                  | 0.6891       | 0.0           | 0.0            | 0.0     | 0.4475               | 0.2968    | 0.6193   | 0.0             | 0.6169  | 0.0      |
| 1.2279        | 1.7105 | 65   | 1.4635          | 0.1929   | 0.3179        | 0.4692           | nan                 | 0.0          | 0.6939                    | 0.2394         | 0.4448        | 0.0                  | 0.8469       | 0.0           | 0.0            | 0.0     | 0.3517               | 0.2280    | 0.4246   | 0.0             | 0.5389  | 0.0      |
| 1.2859        | 1.8421 | 70   | 1.4540          | 0.2030   | 0.2891        | 0.4708           | nan                 | 0.0          | 0.5973                    | 0.1369         | 0.4808        | 0.0                  | 0.8089       | 0.0           | 0.0            | 0.0     | 0.4248               | 0.1346    | 0.4546   | 0.0             | 0.6098  | 0.0      |
| 1.6452        | 1.9737 | 75   | 1.4244          | 0.2336   | 0.3038        | 0.5933           | nan                 | 0.0          | 0.5942                    | 0.1817         | 0.6531        | 0.0                  | 0.6979       | 0.0           | 0.0            | 0.0     | 0.4350               | 0.1742    | 0.6117   | 0.0             | 0.6476  | 0.0      |
| 0.9851        | 2.1053 | 80   | 1.2722          | 0.2418   | 0.3033        | 0.6973           | nan                 | 0.0          | 0.4978                    | 0.0674         | 0.8303        | 0.0                  | 0.7273       | 0.0           | 0.0            | 0.0     | 0.4238               | 0.0668    | 0.7747   | 0.0             | 0.6693  | 0.0      |
| 0.9787        | 2.2368 | 85   | 1.2686          | 0.2403   | 0.2910        | 0.6700           | nan                 | 0.0          | 0.4831                    | 0.0065         | 0.8018        | 0.0                  | 0.7458       | 0.0           | 0.0            | 0.0     | 0.4308               | 0.0064    | 0.7577   | 0.0             | 0.7276  | 0.0      |
| 0.8809        | 2.3684 | 90   | 1.2555          | 0.2511   | 0.3109        | 0.6567           | nan                 | 0.0          | 0.6051                    | 0.0292         | 0.7541        | 0.0                  | 0.7878       | 0.0           | 0.0            | 0.0     | 0.5059               | 0.0288    | 0.7246   | 0.0             | 0.7497  | 0.0      |
| 1.0033        | 2.5    | 95   | 1.2597          | 0.2597   | 0.3421        | 0.6828           | nan                 | 0.0          | 0.7784                    | 0.0607         | 0.7517        | 0.0                  | 0.8040       | 0.0           | 0.0            | 0.0     | 0.5432               | 0.0594    | 0.7306   | 0.0             | 0.7441  | 0.0      |
| 0.9275        | 2.6316 | 100  | 1.2301          | 0.2695   | 0.3516        | 0.7156           | nan                 | 0.0          | 0.7590                    | 0.1137         | 0.7954        | 0.0                  | 0.7928       | 0.0           | 0.0            | 0.0     | 0.5258               | 0.1102    | 0.7662   | 0.0             | 0.7541  | 0.0      |
| 0.8305        | 2.7632 | 105  | 1.1418          | 0.2914   | 0.3563        | 0.7436           | nan                 | 0.0          | 0.5627                    | 0.2494         | 0.8540        | 0.0                  | 0.8279       | 0.0           | 0.0            | 0.0     | 0.4717               | 0.2316    | 0.8070   | 0.0             | 0.8207  | 0.0      |
| 1.4194        | 2.8947 | 110  | 1.2171          | 0.2577   | 0.3141        | 0.6741           | nan                 | 0.0          | 0.3442                    | 0.2353         | 0.8020        | 0.0                  | 0.8172       | 0.0           | 0.0            | 0.0     | 0.3208               | 0.2179    | 0.7558   | 0.0             | 0.7671  | 0.0      |
| 0.9159        | 3.0263 | 115  | 1.5539          | 0.1992   | 0.2485        | 0.3971           | nan                 | 0.0          | 0.3474                    | 0.1528         | 0.4246        | 0.0                  | 0.8146       | 0.0           | 0.0            | 0.0     | 0.2509               | 0.1430    | 0.4104   | 0.0             | 0.7895  | 0.0      |
| 0.8513        | 3.1579 | 120  | 1.6732          | 0.1939   | 0.2713        | 0.3645           | nan                 | 0.0          | 0.6345                    | 0.1103         | 0.3273        | 0.0                  | 0.8272       | 0.0           | 0.0            | 0.0     | 0.3363               | 0.1081    | 0.3223   | 0.0             | 0.7842  | 0.0      |
| 0.655         | 3.2895 | 125  | 1.3317          | 0.2348   | 0.3000        | 0.5930           | nan                 | 0.0          | 0.4572                    | 0.1112         | 0.6802        | 0.0                  | 0.8512       | 0.0           | 0.0            | 0.0     | 0.3833               | 0.1091    | 0.6360   | 0.0             | 0.7501  | 0.0      |
| 1.1348        | 3.4211 | 130  | 1.2073          | 0.2544   | 0.3232        | 0.6840           | nan                 | 0.0          | 0.5067                    | 0.0727         | 0.8010        | 0.0                  | 0.8817       | 0.0           | 0.0            | 0.0     | 0.4064               | 0.0715    | 0.7402   | 0.0             | 0.8169  | 0.0      |
| 1.0625        | 3.5526 | 135  | 1.2138          | 0.2499   | 0.3055        | 0.6480           | nan                 | 0.0          | 0.4694                    | 0.0793         | 0.7599        | 0.0                  | 0.8299       | 0.0           | 0.0            | 0.0     | 0.3803               | 0.0778    | 0.7140   | 0.0             | 0.8275  | 0.0      |
| 0.9589        | 3.6842 | 140  | 1.3193          | 0.2404   | 0.3097        | 0.6012           | nan                 | 0.0          | 0.6269                    | 0.0484         | 0.6678        | 0.0                  | 0.8249       | 0.0           | 0.0            | 0.0     | 0.4164               | 0.0478    | 0.6404   | 0.0             | 0.8189  | 0.0      |
| 1.246         | 3.8158 | 145  | 1.3526          | 0.2097   | 0.2537        | 0.5136           | nan                 | 0.0          | 0.3748                    | 0.0373         | 0.5993        | 0.0                  | 0.7645       | 0.0           | 0.0            | 0.0     | 0.3062               | 0.0366    | 0.5704   | 0.0             | 0.7645  | 0.0      |
| 0.7836        | 3.9474 | 150  | 1.3824          | 0.1970   | 0.2338        | 0.4868           | nan                 | 0.0          | 0.2695                    | 0.0250         | 0.5844        | 0.0                  | 0.7579       | 0.0           | 0.0            | 0.0     | 0.2407               | 0.0248    | 0.5525   | 0.0             | 0.7579  | 0.0      |
| 0.8474        | 4.0789 | 155  | 1.4235          | 0.1998   | 0.2368        | 0.4725           | nan                 | 0.0          | 0.2587                    | 0.0655         | 0.5606        | 0.0                  | 0.7724       | 0.0           | 0.0            | 0.0     | 0.2365               | 0.0629    | 0.5275   | 0.0             | 0.7712  | 0.0      |
| 1.0723        | 4.2105 | 160  | 1.4177          | 0.2109   | 0.2510        | 0.4877           | nan                 | 0.0          | 0.2976                    | 0.1024         | 0.5688        | 0.0                  | 0.7884       | 0.0           | 0.0            | 0.0     | 0.2681               | 0.0978    | 0.5343   | 0.0             | 0.7868  | 0.0      |
| 1.1283        | 4.3421 | 165  | 1.4844          | 0.2020   | 0.2396        | 0.4298           | nan                 | 0.0          | 0.2979                    | 0.1040         | 0.4875        | 0.0                  | 0.7877       | 0.0           | 0.0            | 0.0     | 0.2671               | 0.1007    | 0.4621   | 0.0             | 0.7860  | 0.0      |
| 0.6614        | 4.4737 | 170  | 1.4177          | 0.2111   | 0.2522        | 0.4542           | nan                 | 0.0          | 0.3361                    | 0.1011         | 0.5131        | 0.0                  | 0.8151       | 0.0           | 0.0            | 0.0     | 0.2906               | 0.0980    | 0.4868   | 0.0             | 0.8136  | 0.0      |
| 1.0973        | 4.6053 | 175  | 1.5041          | 0.2078   | 0.2573        | 0.4288           | nan                 | 0.0          | 0.4572                    | 0.0763         | 0.4572        | 0.0                  | 0.8107       | 0.0           | 0.0            | 0.0     | 0.3406               | 0.0750    | 0.4386   | 0.0             | 0.8081  | 0.0      |
| 0.8756        | 4.7368 | 180  | 1.3542          | 0.2236   | 0.2753        | 0.4929           | nan                 | 0.0          | 0.4403                    | 0.1089         | 0.5448        | 0.0                  | 0.8332       | 0.0           | 0.0            | 0.0     | 0.3338               | 0.1051    | 0.5177   | 0.0             | 0.8324  | 0.0      |
| 0.6712        | 4.8684 | 185  | 1.3772          | 0.2232   | 0.2761        | 0.4811           | nan                 | 0.0          | 0.4525                    | 0.1373         | 0.5231        | 0.0                  | 0.8195       | 0.0           | 0.0            | 0.0     | 0.3373               | 0.1303    | 0.4993   | 0.0             | 0.8185  | 0.0      |
| 1.2096        | 5.0    | 190  | 1.2868          | 0.2379   | 0.2922        | 0.5492           | nan                 | 0.0          | 0.4334                    | 0.1291         | 0.6209        | 0.0                  | 0.8620       | 0.0           | 0.0            | 0.0     | 0.3329               | 0.1229    | 0.5861   | 0.0             | 0.8615  | 0.0      |


### Framework versions

- Transformers 4.46.2
- Pytorch 2.4.1+cpu
- Datasets 3.1.0
- Tokenizers 0.20.3