---
library_name: transformers
tags:
- generated_from_trainer
model-index:
- name: glacier_segmentation_transformer
  results: []
---

# glacier_segmentation_transformer

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0282
- Mean Iou: 0.9476
- Mean Accuracy: 0.9713
- Overall Accuracy: 0.9770
- Per Category Iou: [0.9615266525679848, 0.9111778317508353, 0.9702425642603907]
- Per Category Accuracy: [0.982983320714055, 0.9432032649898906, 0.987791939813118]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.00012
- train_batch_size: 400
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:----------------:|:---------------------:|
| 0.0828 | 1.0 | 352 | 0.0592 | 0.9208 | 0.9565 | 0.9653 | [0.9469681453924418, 0.8581945798277895, 0.9570958935975151] | [0.9804346868205148, 0.9098953383424603, 0.979223227765012] |
| 0.0653 | 2.0 | 704 | 0.0545 | 0.9243 | 0.9585 | 0.9668 | [0.9491763751018332, 0.8652493086329979, 0.9583893801723153] | [0.9756058281282557, 0.9177942565930071, 0.9822091134529419] |
| 0.0615 | 3.0 | 1056 | 0.0520 | 0.9245 | 0.9576 | 0.9673 | [0.9488521139512287, 0.8629337582590615, 0.9615905436256834] | [0.9798485904989809, 0.9082150871992817, 0.9846862114785438] |
| 0.0568 | 4.0 | 1408 | 0.0497 | 0.9266 | 0.9586 | 0.9682 | [0.949606783690876, 0.8679034925410897, 0.962356135274943] | [0.9740655396584907, 0.9135029366787574, 0.9881750267882855] |
| 0.0553 | 5.0 | 1760 | 0.0448 | 0.9317 | 0.9624 | 0.9703 | [0.9518683104854363, 0.8791552266868088, 0.9640612798278586] | [0.9766581269222775, 0.9247480628978334, 0.9858310012185293] |
| 0.0531 | 6.0 | 2112 | 0.0440 | 0.9324 | 0.9633 | 0.9704 | [0.9519743922792153, 0.8820510654279622, 0.9633058622325834] | [0.978891790566398, 0.9277520457319947, 0.9833093882099169] |
| 0.0516 | 7.0 | 2464 | 0.0415 | 0.9342 | 0.9635 | 0.9713 | [0.9529858938673607, 0.8844842403407042, 0.9650597333255734] | [0.9767157615257003, 0.9266858712956193, 0.9871889860088424] |
| 0.0496 | 8.0 | 2816 | 0.0406 | 0.9359 | 0.9648 | 0.9719 | [0.9555651950006734, 0.8876507922287321, 0.964435754608489] | [0.9797262558665514, 0.9292800603938921, 0.9853461182114515] |
| 0.0493 | 9.0 | 3168 | 0.0393 | 0.9365 | 0.9648 | 0.9724 | [0.9556626924479936, 0.8881594759577742, 0.9657764420138888] | [0.9795360289173738, 0.927896526148365, 0.9869706285329666] |
| 0.0463 | 10.0 | 3520 | 0.0391 | 0.9371 | 0.9657 | 0.9724 | [0.9536840950074162, 0.8916567818261698, 0.9658607405632947] | [0.9765900676045612, 0.9343429939730185, 0.9862598369089146] |
| 0.046 | 11.0 | 3872 | 0.0379 | 0.9383 | 0.9669 | 0.9729 | [0.9544427772914603, 0.8947383943199397, 0.965840875611707] | [0.9764990841923821, 0.938799387304811, 0.9853293009539879] |
| 0.0452 | 12.0 | 4224 | 0.0358 | 0.9402 | 0.9671 | 0.9739 | [0.958365142157537, 0.8951548166027431, 0.9670090163868024] | [0.9826695002020712, 0.9327967021491497, 0.985958578753622] |
| 0.0452 | 13.0 | 4576 | 0.0353 | 0.9421 | 0.9697 | 0.9742 | [0.9566153552692265, 0.9041663510518988, 0.9654341965129734] | [0.9781511412717715, 0.947970829250211, 0.9829622282166149] |
| 0.0433 | 14.0 | 4928 | 0.0352 | 0.9405 | 0.9668 | 0.9742 | [0.9595360452886308, 0.8947929690736943, 0.9672666207041147] | [0.9845407781679891, 0.9292349314756656, 0.9867658421113708] |
| 0.0431 | 15.0 | 5280 | 0.0332 | 0.9429 | 0.9683 | 0.9751 | [0.9590021679605092, 0.9009428341327679, 0.9686646519537535] | [0.9808890258011358, 0.9357498662877368, 0.9883632103243261] |
| 0.0416 | 16.0 | 5632 | 0.0327 | 0.9434 | 0.9688 | 0.9753 | [0.9597417369530555, 0.901854332222541, 0.9686551773513613] | [0.9831348612466031, 0.9360674756485265, 0.9871674788190165] |
| 0.0414 | 17.0 | 5984 | 0.0319 | 0.9442 | 0.9695 | 0.9755 | [0.9600470959530699, 0.9043351787260081, 0.9681169153162615] | [0.982250790235736, 0.9399363046395433, 0.9864601258953718] |
| 0.0404 | 18.0 | 6336 | 0.0316 | 0.9446 | 0.9701 | 0.9755 | [0.9589776803135457, 0.9068715131060119, 0.9679329085470226] | [0.9802443827939673, 0.9437927758334433, 0.9862343091520727] |
| 0.0398 | 19.0 | 6688 | 0.0314 | 0.9446 | 0.9699 | 0.9757 | [0.9588814103780426, 0.905637545251067, 0.9692458875824727] | [0.98042751220199, 0.9420745529130178, 0.987227866073119] |
| 0.0386 | 20.0 | 7040 | 0.0303 | 0.9455 | 0.9698 | 0.9762 | [0.9601095729129755, 0.9062431533997868, 0.9700239091770606] | [0.9816721896884353, 0.9390483599830692, 0.9887402730113751] |
| 0.0389 | 21.0 | 7392 | 0.0293 | 0.9467 | 0.9709 | 0.9766 | [0.9608462864680724, 0.9092665750526754, 0.9699517538184947] | [0.9825323474452577, 0.9427535729381092, 0.9874417479885407] |
| 0.0381 | 22.0 | 7744 | 0.0289 | 0.9470 | 0.9711 | 0.9767 | [0.9613365594140901, 0.9099925770562975, 0.969649897217598] | [0.9831566934116577, 0.9431333306388411, 0.9870399712829142] |
| 0.0376 | 23.0 | 8096 | 0.0286 | 0.9473 | 0.9713 | 0.9769 | [0.9609977973725258, 0.9109702591251677, 0.970032992637466] | [0.9821836494233457, 0.9442533383545785, 0.98751595129339] |
| 0.0375 | 24.0 | 8448 | 0.0283 | 0.9472 | 0.9709 | 0.9769 | [0.9616875865568749, 0.9094524903163835, 0.9705179691807947] | [0.9833220629090078, 0.940861432490088, 0.9883972648331962] |
| 0.0363 | 25.0 | 8800 | 0.0282 | 0.9476 | 0.9713 | 0.9770 | [0.9615266525679848, 0.9111778317508353, 0.9702425642603907] | [0.982983320714055, 0.9432032649898906, 0.987791939813118] |

### Framework versions

- Transformers 4.45.1
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0
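
### Reproducing the training configuration

The hyperparameters listed under "Training hyperparameters" map onto `transformers.TrainingArguments` roughly as sketched below. This is an assumption-laden reconstruction, not the original training script: `output_dir` is invented, the per-device/total batch-size split is unknown (400 is reported as the total train batch size), and "Native AMP" is assumed to correspond to `fp16=True` on CUDA. The Adam betas and epsilon shown are the Trainer defaults, which match the values reported in the card.

```python
from transformers import TrainingArguments

# Sketch of a TrainingArguments config matching the reported hyperparameters.
args = TrainingArguments(
    output_dir="glacier_segmentation_transformer",  # assumed, not from the card
    learning_rate=1.2e-4,
    per_device_train_batch_size=400,  # card reports total train_batch_size: 400
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,                   # Trainer default, matches the card
    adam_beta2=0.999,                 # Trainer default, matches the card
    adam_epsilon=1e-8,                # Trainer default, matches the card
    lr_scheduler_type="linear",
    num_train_epochs=25,
    fp16=True,                        # "Native AMP" mixed precision (assumed)
)
```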
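
### Metric definitions

The Mean Iou and Per Category Iou figures reported above follow the standard semantic-segmentation definition: for each class, the ratio of correctly labeled pixels (intersection) to all pixels labeled with that class in either the prediction or the ground truth (union), averaged over the three classes. A minimal plain-Python sketch (the label arrays are illustrative, not real model output):

```python
def per_class_iou(pred, target, num_classes):
    """Compute IoU for each class over flattened label arrays."""
    ious = []
    for c in range(num_classes):
        inter = sum(1 for p, t in zip(pred, target) if p == c and t == c)
        union = sum(1 for p, t in zip(pred, target) if p == c or t == c)
        ious.append(inter / union if union else float("nan"))
    return ious

# Illustrative 3-class example (8 pixels).
pred   = [0, 0, 1, 1, 2, 2, 2, 0]
target = [0, 1, 1, 1, 2, 2, 0, 0]
ious = per_class_iou(pred, target, num_classes=3)  # [0.5, 2/3, 2/3]
mean_iou = sum(ious) / len(ious)                   # ~0.611
```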