2023-02-19 04:01:09,440 - mmseg - INFO - Multi-processing start method is `None`
2023-02-19 04:01:09,441 - mmseg - INFO - OpenCV num_threads is `112`
2023-02-19 04:01:09,441 - mmseg - INFO - OMP num threads is 1
2023-02-19 04:01:09,478 - mmseg - INFO - Environment info:
------------------------------------------------------------
sys.platform: linux
Python: 3.7.13 (default, Mar 29 2022, 02:18:16) [GCC 7.5.0]
CUDA available: True
GPU 0,1,2,3,4,5,6,7: A100-SXM-80GB
CUDA_HOME: /usr/local/cuda
NVCC: Build cuda_11.2.r11.2/compiler.29618528_0
GCC: gcc (GCC) 5.4.0
PyTorch: 1.9.0+cu111
PyTorch compiling details: PyTorch built with:
  - GCC 7.3
  - C++ Version: 201402
  - Intel(R) Math Kernel Library Version 2020.0.0 Product Build 20191122 for Intel(R) 64 architecture applications
  - Intel(R) MKL-DNN v2.1.2 (Git Hash 98be7e8afa711dc9b66c8ff3504129cb82013cdb)
  - OpenMP 201511 (a.k.a. OpenMP 4.5)
  - NNPACK is enabled
  - CPU capability usage: AVX2
  - CUDA Runtime 11.1
  - NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86
  - CuDNN 8.0.5
  - Magma 2.5.2
  - Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.1, CUDNN_VERSION=8.0.5, CXX_COMPILER=/opt/rh/devtoolset-7/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-sign-compare -Wno-unused-parameter -Wno-unused-variable -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.9.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=ON, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, 

TorchVision: 0.10.0+cu111
OpenCV: 4.6.0
MMCV: 1.4.2
MMCV Compiler: GCC 7.3
MMCV CUDA Compiler: 11.1
MMSegmentation: 0.29.0+
------------------------------------------------------------
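
The environment table above can be regenerated with mmcv's helper; a minimal sketch, assuming the mmcv 1.4.x API reported in this log:

from mmcv.utils import collect_env

# Prints the same name/value pairs that mmseg logs at startup.
for name, value in collect_env().items():
    print(f'{name}: {value}')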

2023-02-19 04:01:09,479 - mmseg - INFO - Distributed training: True
2023-02-19 04:01:09,952 - mmseg - INFO - Config:
dataset_type = 'ADE20KDataset'
data_root = 'data/ade/ADEChallengeData2016'
img_norm_cfg = dict(
    mean=[123.675, 116.28, 103.53], std=[58.395, 57.12, 57.375], to_rgb=True)
crop_size = (512, 512)
train_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(type='LoadAnnotations', reduce_zero_label=True),
    dict(type='Resize', img_scale=(2048, 512), ratio_range=(0.5, 2.0)),
    dict(type='RandomCrop', crop_size=(512, 512), cat_max_ratio=0.75),
    dict(type='RandomFlip', prob=0.5),
    dict(type='PhotoMetricDistortion'),
    dict(
        type='Normalize',
        mean=[123.675, 116.28, 103.53],
        std=[58.395, 57.12, 57.375],
        to_rgb=True),
    dict(type='Pad', size=(512, 512), pad_val=0, seg_pad_val=255),
    dict(type='DefaultFormatBundle'),
    dict(type='Collect', keys=['img', 'gt_semantic_seg'])
]
test_pipeline = [
    dict(type='LoadImageFromFile'),
    dict(
        type='MultiScaleFlipAug',
        img_scale=(2048, 512),
        flip=False,
        transforms=[
            dict(type='Resize', keep_ratio=True),
            dict(type='RandomFlip'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='ImageToTensor', keys=['img']),
            dict(type='Collect', keys=['img'])
        ])
]
data = dict(
    samples_per_gpu=2,
    workers_per_gpu=2,
    train=dict(
        type='ADE20KDataset',
        data_root='data/ade/ADEChallengeData2016',
        img_dir='images/training',
        ann_dir='annotations/training',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(type='LoadAnnotations', reduce_zero_label=True),
            dict(type='Resize', img_scale=(2048, 512), ratio_range=(0.5, 2.0)),
            dict(type='RandomCrop', crop_size=(512, 512), cat_max_ratio=0.75),
            dict(type='RandomFlip', prob=0.5),
            dict(type='PhotoMetricDistortion'),
            dict(
                type='Normalize',
                mean=[123.675, 116.28, 103.53],
                std=[58.395, 57.12, 57.375],
                to_rgb=True),
            dict(type='Pad', size=(512, 512), pad_val=0, seg_pad_val=255),
            dict(type='DefaultFormatBundle'),
            dict(type='Collect', keys=['img', 'gt_semantic_seg'])
        ]),
    val=dict(
        type='ADE20KDataset',
        data_root='data/ade/ADEChallengeData2016',
        img_dir='images/validation',
        ann_dir='annotations/validation',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(2048, 512),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]),
    test=dict(
        type='ADE20KDataset',
        data_root='data/ade/ADEChallengeData2016',
        img_dir='images/validation',
        ann_dir='annotations/validation',
        pipeline=[
            dict(type='LoadImageFromFile'),
            dict(
                type='MultiScaleFlipAug',
                img_scale=(2048, 512),
                flip=False,
                transforms=[
                    dict(type='Resize', keep_ratio=True),
                    dict(type='RandomFlip'),
                    dict(
                        type='Normalize',
                        mean=[123.675, 116.28, 103.53],
                        std=[58.395, 57.12, 57.375],
                        to_rgb=True),
                    dict(type='ImageToTensor', keys=['img']),
                    dict(type='Collect', keys=['img'])
                ])
        ]))
log_config = dict(
    interval=50, hooks=[dict(type='TextLoggerHook', by_epoch=False)])
dist_params = dict(backend='nccl')
log_level = 'INFO'
load_from = None
resume_from = None
workflow = [('train', 1)]
cudnn_benchmark = True
optimizer = dict(
    type='AdamW',
    lr=6e-05,
    betas=(0.9, 0.999),
    weight_decay=0.01,
    paramwise_cfg=dict(
        custom_keys=dict(
            pos_block=dict(decay_mult=0.0), norm=dict(decay_mult=0.0))))
optimizer_config = dict(grad_clip=dict(max_norm=0.1, norm_type=2))
lr_config = dict(
    policy='poly',
    warmup='linear',
    warmup_iters=1500,
    warmup_ratio=1e-06,
    power=1.0,
    min_lr=0.0,
    by_epoch=False)
runner = dict(type='IterBasedRunner', max_iters=160000)
checkpoint_config = dict(by_epoch=False, interval=1000, max_keep_ckpts=1)
evaluation = dict(
    interval=16000, metric='mIoU', pre_eval=True, save_best='mIoU')
checkpoint_file = 'https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth'
norm_cfg = dict(type='SyncBN', requires_grad=True)
backbone_norm_cfg = dict(type='LN', requires_grad=True)
model = dict(
    type='DiffSegV20',
    bit_scale=0.01,
    pretrained=None,
    backbone=dict(
        type='SwinTransformer',
        init_cfg=dict(
            type='Pretrained',
            checkpoint=
            'https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth'
        ),
        pretrain_img_size=384,
        in_channels=3,
        embed_dims=192,
        patch_size=4,
        window_size=12,
        mlp_ratio=4,
        depths=[2, 2, 18, 2],
        num_heads=[6, 12, 24, 48],
        strides=(4, 2, 2, 2),
        out_indices=(0, 1, 2, 3),
        qkv_bias=True,
        qk_scale=None,
        patch_norm=True,
        drop_rate=0.0,
        attn_drop_rate=0.0,
        drop_path_rate=0.3,
        use_abs_pos_embed=False,
        act_cfg=dict(type='GELU'),
        norm_cfg=dict(type='LN', requires_grad=True)),
    neck=[
        dict(
            type='FPN',
            in_channels=[192, 384, 768, 1536],
            out_channels=256,
            act_cfg=None,
            norm_cfg=dict(type='GN', num_groups=32),
            num_outs=4),
        dict(
            type='MultiStageMerging',
            in_channels=[256, 256, 256, 256],
            out_channels=256,
            kernel_size=1,
            norm_cfg=dict(type='GN', num_groups=32),
            act_cfg=None)
    ],
    auxiliary_head=dict(
        type='FCNHead',
        in_channels=256,
        in_index=0,
        channels=256,
        num_convs=1,
        concat_input=False,
        dropout_ratio=0.1,
        num_classes=150,
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        align_corners=False,
        loss_decode=dict(
            type='CrossEntropyLoss', use_sigmoid=False, loss_weight=0.4)),
    decode_head=dict(
        type='DeformableHeadWithTime',
        in_channels=[256],
        channels=256,
        in_index=[0],
        dropout_ratio=0.0,
        num_classes=150,
        norm_cfg=dict(type='SyncBN', requires_grad=True),
        align_corners=False,
        num_feature_levels=1,
        encoder=dict(
            type='DetrTransformerEncoder',
            num_layers=6,
            transformerlayers=dict(
                type='BaseTransformerLayer',
                use_time_mlp=True,
                attn_cfgs=dict(
                    type='MultiScaleDeformableAttention',
                    embed_dims=256,
                    num_levels=1,
                    num_heads=8,
                    dropout=0.0),
                ffn_cfgs=dict(
                    type='FFN',
                    embed_dims=256,
                    feedforward_channels=1024,
                    ffn_drop=0.0,
                    act_cfg=dict(type='GELU')),
                operation_order=('self_attn', 'norm', 'ffn', 'norm'))),
        positional_encoding=dict(
            type='SinePositionalEncoding',
            num_feats=128,
            normalize=True,
            offset=-0.5),
        loss_decode=dict(
            type='CrossEntropyLoss', use_sigmoid=False, loss_weight=1.0)),
    train_cfg=dict(),
    test_cfg=dict(mode='whole'))
work_dir = './work_dirs/diffseg_swin_l_2x8_512x512_160k_ade20k_v20'
gpu_ids = range(0, 8)
auto_resume = False
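
A minimal sketch of rebuilding the segmentor and training set from a config like the one dumped above, assuming the mmcv 1.4.x / mmsegmentation 0.x APIs reported in this log. The config path is hypothetical, and the custom components (DiffSegV20, DeformableHeadWithTime, MultiStageMerging) must already be registered by the project's own code:

from mmcv import Config
from mmseg.datasets import build_dataset
from mmseg.models import build_segmentor

# Hypothetical path; point this at the project's actual config file.
cfg = Config.fromfile('configs/diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py')

# Build the model and the training dataset the same way the mmseg 0.x tools do.
model = build_segmentor(
    cfg.model, train_cfg=cfg.get('train_cfg'), test_cfg=cfg.get('test_cfg'))
train_set = build_dataset(cfg.data.train)
print(type(model).__name__, len(train_set))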

2023-02-19 04:01:16,626 - mmseg - INFO - Set random seed to 1094882664, deterministic: True
2023-02-19 04:01:17,813 - mmseg - INFO - load checkpoint from http path: https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth
2023-02-19 04:03:41,895 - mmseg - WARNING - The model and loaded state dict do not match exactly

unexpected key in source state_dict: norm.weight, norm.bias, stages.0.blocks.1.attn_mask, stages.1.blocks.1.attn_mask, stages.2.blocks.1.attn_mask, stages.2.blocks.3.attn_mask, stages.2.blocks.5.attn_mask, stages.2.blocks.7.attn_mask, stages.2.blocks.9.attn_mask, stages.2.blocks.11.attn_mask, stages.2.blocks.13.attn_mask, stages.2.blocks.15.attn_mask, stages.2.blocks.17.attn_mask

missing keys in source state_dict: norm0.weight, norm0.bias, norm1.weight, norm1.bias, norm2.weight, norm2.bias, norm3.weight, norm3.bias
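
The mismatch above comes from non-strict checkpoint loading: the converted ImageNet-22K checkpoint keeps a classification-style final norm and cached attn_mask buffers (reported as unexpected), while the per-stage norm0-norm3 layers added for dense prediction are absent from it (reported as missing) and keep their fresh initialization. A minimal sketch of such a non-strict load, assuming the mmcv 1.4.x API; swin_l is an illustrative backbone built from the Swin-L settings in the config dump:

from mmcv.runner import load_checkpoint
from mmseg.models import build_backbone

# Swin-L backbone with per-stage outputs, mirroring the backbone dict above.
swin_l = build_backbone(dict(
    type='SwinTransformer', pretrain_img_size=384, embed_dims=192,
    patch_size=4, window_size=12, mlp_ratio=4,
    depths=[2, 2, 18, 2], num_heads=[6, 12, 24, 48],
    out_indices=(0, 1, 2, 3)))

# strict=False: keys present on only one side are logged as a warning, not raised.
load_checkpoint(
    swin_l,
    'https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/'
    'swin_large_patch4_window12_384_22k_20220412-6580f57d.pth',
    map_location='cpu', strict=False)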

2023-02-19 04:03:41,960 - mmseg - INFO - initialize FPN with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2023-02-19 04:03:41,981 - mmseg - INFO - initialize MultiStageMerging with init_cfg {'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
2023-02-19 04:03:42,035 - mmseg - INFO - initialize FCNHead with init_cfg {'type': 'Normal', 'std': 0.01, 'override': {'name': 'conv_seg'}}
Name of parameter - Initialization information

backbone.patch_embed.projection.weight - torch.Size([192, 3, 4, 4]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.patch_embed.projection.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.patch_embed.norm.weight - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.patch_embed.norm.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.norm1.weight - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.norm1.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 6]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.attn.w_msa.qkv.weight - torch.Size([576, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.attn.w_msa.qkv.bias - torch.Size([576]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.attn.w_msa.proj.weight - torch.Size([192, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.attn.w_msa.proj.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.norm2.weight - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.norm2.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.ffn.layers.0.0.weight - torch.Size([768, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.ffn.layers.0.0.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.ffn.layers.1.weight - torch.Size([192, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.0.ffn.layers.1.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.norm1.weight - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.norm1.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 6]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.attn.w_msa.qkv.weight - torch.Size([576, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.attn.w_msa.qkv.bias - torch.Size([576]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.attn.w_msa.proj.weight - torch.Size([192, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.attn.w_msa.proj.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.norm2.weight - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.norm2.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.ffn.layers.0.0.weight - torch.Size([768, 192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.ffn.layers.0.0.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.ffn.layers.1.weight - torch.Size([192, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.blocks.1.ffn.layers.1.bias - torch.Size([192]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.downsample.norm.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.downsample.norm.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.0.downsample.reduction.weight - torch.Size([384, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.norm1.weight - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.norm1.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 12]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.attn.w_msa.qkv.weight - torch.Size([1152, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.attn.w_msa.qkv.bias - torch.Size([1152]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.attn.w_msa.proj.weight - torch.Size([384, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.attn.w_msa.proj.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.norm2.weight - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.norm2.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.ffn.layers.0.0.weight - torch.Size([1536, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.ffn.layers.0.0.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.ffn.layers.1.weight - torch.Size([384, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.0.ffn.layers.1.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.norm1.weight - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.norm1.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 12]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.attn.w_msa.qkv.weight - torch.Size([1152, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.attn.w_msa.qkv.bias - torch.Size([1152]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.attn.w_msa.proj.weight - torch.Size([384, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.attn.w_msa.proj.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.norm2.weight - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.norm2.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.ffn.layers.0.0.weight - torch.Size([1536, 384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.ffn.layers.0.0.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.ffn.layers.1.weight - torch.Size([384, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.blocks.1.ffn.layers.1.bias - torch.Size([384]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.downsample.norm.weight - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.downsample.norm.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.1.downsample.reduction.weight - torch.Size([768, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.0.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.1.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.2.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.3.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.4.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.5.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.6.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.7.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.8.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.9.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.10.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.11.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.12.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.13.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.14.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.15.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.16.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.norm1.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.norm1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.attn.w_msa.relative_position_bias_table - torch.Size([529, 24]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.attn.w_msa.qkv.weight - torch.Size([2304, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.attn.w_msa.qkv.bias - torch.Size([2304]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.attn.w_msa.proj.weight - torch.Size([768, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.attn.w_msa.proj.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.norm2.weight - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.norm2.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.ffn.layers.0.0.weight - torch.Size([3072, 768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.ffn.layers.0.0.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.ffn.layers.1.weight - torch.Size([768, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.blocks.17.ffn.layers.1.bias - torch.Size([768]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.downsample.norm.weight - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.downsample.norm.bias - torch.Size([3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.2.downsample.reduction.weight - torch.Size([1536, 3072]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.norm1.weight - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.norm1.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.attn.w_msa.relative_position_bias_table - torch.Size([529, 48]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.attn.w_msa.qkv.weight - torch.Size([4608, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.attn.w_msa.qkv.bias - torch.Size([4608]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.attn.w_msa.proj.weight - torch.Size([1536, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.attn.w_msa.proj.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.norm2.weight - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.norm2.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.ffn.layers.0.0.weight - torch.Size([6144, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.ffn.layers.0.0.bias - torch.Size([6144]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.ffn.layers.1.weight - torch.Size([1536, 6144]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.0.ffn.layers.1.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.norm1.weight - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.norm1.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.attn.w_msa.relative_position_bias_table - torch.Size([529, 48]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.attn.w_msa.qkv.weight - torch.Size([4608, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.attn.w_msa.qkv.bias - torch.Size([4608]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.attn.w_msa.proj.weight - torch.Size([1536, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.attn.w_msa.proj.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.norm2.weight - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.norm2.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.ffn.layers.0.0.weight - torch.Size([6144, 1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.ffn.layers.0.0.bias - torch.Size([6144]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.ffn.layers.1.weight - torch.Size([1536, 6144]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.stages.3.blocks.1.ffn.layers.1.bias - torch.Size([1536]): 
Initialized by user-defined `init_weights` in SwinTransformer  

backbone.norm0.weight - torch.Size([192]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm0.bias - torch.Size([192]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm1.weight - torch.Size([384]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm1.bias - torch.Size([384]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm2.weight - torch.Size([768]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm2.bias - torch.Size([768]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm3.weight - torch.Size([1536]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

backbone.norm3.bias - torch.Size([1536]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.0.conv.weight - torch.Size([256, 192, 1, 1]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.lateral_convs.0.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.0.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.1.conv.weight - torch.Size([256, 384, 1, 1]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.lateral_convs.1.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.1.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.2.conv.weight - torch.Size([256, 768, 1, 1]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.lateral_convs.2.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.2.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.3.conv.weight - torch.Size([256, 1536, 1, 1]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.lateral_convs.3.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.lateral_convs.3.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.0.conv.weight - torch.Size([256, 256, 3, 3]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.fpn_convs.0.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.0.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.1.conv.weight - torch.Size([256, 256, 3, 3]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.fpn_convs.1.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.1.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.2.conv.weight - torch.Size([256, 256, 3, 3]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.fpn_convs.2.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.2.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.3.conv.weight - torch.Size([256, 256, 3, 3]): 
XavierInit: gain=1, distribution=uniform, bias=0 

neck.0.fpn_convs.3.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.0.fpn_convs.3.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.1.down.conv.weight - torch.Size([256, 1024, 1, 1]): 
Initialized by user-defined `init_weights` in ConvModule  

neck.1.down.gn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

neck.1.down.gn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.conv_seg.weight - torch.Size([150, 256, 1, 1]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.conv_seg.bias - torch.Size([150]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.0.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.0.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.0.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.0.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.0.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.0.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.1.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.1.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.1.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.1.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.1.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.1.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.2.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.2.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.2.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.2.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.2.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.2.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.3.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.3.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.3.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.3.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.3.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.3.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.4.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.4.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.4.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.4.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.4.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.4.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.sampling_offsets.weight - torch.Size([64, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.sampling_offsets.bias - torch.Size([64]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.attention_weights.weight - torch.Size([32, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.attention_weights.bias - torch.Size([32]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.value_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.5.attentions.0.value_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.attentions.0.output_proj.weight - torch.Size([256, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.5.attentions.0.output_proj.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.time_mlp.1.weight - torch.Size([512, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.5.time_mlp.1.bias - torch.Size([512]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.ffns.0.layers.0.0.weight - torch.Size([1024, 256]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.5.ffns.0.layers.0.0.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.ffns.0.layers.1.weight - torch.Size([256, 1024]): 
Initialized by user-defined `init_weights` in DeformableHeadWithTime  

decode_head.encoder.layers.5.ffns.0.layers.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.norms.0.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.norms.0.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.norms.1.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

decode_head.encoder.layers.5.norms.1.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

auxiliary_head.conv_seg.weight - torch.Size([150, 256, 1, 1]): 
NormalInit: mean=0, std=0.01, bias=0 

auxiliary_head.conv_seg.bias - torch.Size([150]): 
NormalInit: mean=0, std=0.01, bias=0 

auxiliary_head.convs.0.conv.weight - torch.Size([256, 256, 3, 3]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

auxiliary_head.convs.0.bn.weight - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

auxiliary_head.convs.0.bn.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

embedding_table.weight - torch.Size([151, 256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

transform.conv.weight - torch.Size([256, 512, 1, 1]): 
Initialized by user-defined `init_weights` in ConvModule  

transform.conv.bias - torch.Size([256]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

time_mlp.0.weights - torch.Size([8]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

time_mlp.1.weight - torch.Size([1024, 17]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

time_mlp.1.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

time_mlp.3.weight - torch.Size([1024, 1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  

time_mlp.3.bias - torch.Size([1024]): 
The value is the same before and after calling `init_weights` of DiffSegV20  
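For reference, a per-parameter initialization report like the one above can be regenerated from the same config with a short snippet along the following lines. This is a minimal sketch against the versions recorded in the environment info (mmseg 0.29, mmcv 1.4.2); the config path and log-file name are placeholders and are not taken from this log.

    from mmcv import Config
    from mmseg.models import build_segmentor
    from mmseg.utils import get_root_logger

    # Placeholder path: substitute the actual DiffSegV20 config used for this run.
    cfg = Config.fromfile('configs/diffseg/diffseg_v20_swin_large_ade20k.py')

    # mmcv's BaseModule dumps the per-parameter init info through a file-backed
    # logger, so create one before calling init_weights().
    logger = get_root_logger(log_file='init_report.log')

    model = build_segmentor(
        cfg.model,
        train_cfg=cfg.get('train_cfg'),
        test_cfg=cfg.get('test_cfg'))

    # Loads the pretrained Swin-L checkpoint named in backbone.init_cfg, applies
    # the Xavier/Normal init_cfg entries of the neck and auxiliary head, and logs
    # lines such as "Initialized by user-defined `init_weights` in SwinTransformer".
    model.init_weights()

    # Printing the model reproduces the module tree dumped below.
    print(model)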
2023-02-19 04:03:42,042 - mmseg - INFO - DiffSegV20(
  (backbone): SwinTransformer(
    (patch_embed): PatchEmbed(
      (adap_padding): AdaptivePadding()
      (projection): Conv2d(3, 192, kernel_size=(4, 4), stride=(4, 4))
      (norm): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
    )
    (drop_after_pos): Dropout(p=0.0, inplace=False)
    (stages): ModuleList(
      (0): SwinBlockSequence(
        (blocks): ModuleList(
          (0): SwinBlock(
            (norm1): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=192, out_features=576, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=192, out_features=192, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=192, out_features=768, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=768, out_features=192, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (1): SwinBlock(
            (norm1): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=192, out_features=576, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=192, out_features=192, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=192, out_features=768, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=768, out_features=192, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
        )
        (downsample): PatchMerging(
          (adap_padding): AdaptivePadding()
          (sampler): Unfold(kernel_size=(2, 2), dilation=(1, 1), padding=(0, 0), stride=(2, 2))
          (norm): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
          (reduction): Linear(in_features=768, out_features=384, bias=False)
        )
      )
      (1): SwinBlockSequence(
        (blocks): ModuleList(
          (0): SwinBlock(
            (norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=384, out_features=1152, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=384, out_features=384, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=384, out_features=1536, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1536, out_features=384, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (1): SwinBlock(
            (norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=384, out_features=1152, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=384, out_features=384, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=384, out_features=1536, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1536, out_features=384, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
        )
        (downsample): PatchMerging(
          (adap_padding): AdaptivePadding()
          (sampler): Unfold(kernel_size=(2, 2), dilation=(1, 1), padding=(0, 0), stride=(2, 2))
          (norm): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
          (reduction): Linear(in_features=1536, out_features=768, bias=False)
        )
      )
      (2): SwinBlockSequence(
        (blocks): ModuleList(
          (0): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (1): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (2): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (3): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (4): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (5): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (6): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (7): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (8): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (9): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (10): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (11): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (12): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (13): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (14): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (15): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (16): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (17): SwinBlock(
            (norm1): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=768, out_features=2304, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=768, out_features=768, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=768, out_features=3072, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=3072, out_features=768, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
        )
        (downsample): PatchMerging(
          (adap_padding): AdaptivePadding()
          (sampler): Unfold(kernel_size=(2, 2), dilation=(1, 1), padding=(0, 0), stride=(2, 2))
          (norm): LayerNorm((3072,), eps=1e-05, elementwise_affine=True)
          (reduction): Linear(in_features=3072, out_features=1536, bias=False)
        )
      )
      (3): SwinBlockSequence(
        (blocks): ModuleList(
          (0): SwinBlock(
            (norm1): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=1536, out_features=4608, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=1536, out_features=1536, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=1536, out_features=6144, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=6144, out_features=1536, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
          (1): SwinBlock(
            (norm1): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
            (attn): ShiftWindowMSA(
              (w_msa): WindowMSA(
                (qkv): Linear(in_features=1536, out_features=4608, bias=True)
                (attn_drop): Dropout(p=0.0, inplace=False)
                (proj): Linear(in_features=1536, out_features=1536, bias=True)
                (proj_drop): Dropout(p=0.0, inplace=False)
                (softmax): Softmax(dim=-1)
              )
              (drop): DropPath()
            )
            (norm2): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
            (ffn): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=1536, out_features=6144, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=6144, out_features=1536, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): DropPath()
            )
          )
        )
      )
    )
    (norm0): LayerNorm((192,), eps=1e-05, elementwise_affine=True)
    (norm1): LayerNorm((384,), eps=1e-05, elementwise_affine=True)
    (norm2): LayerNorm((768,), eps=1e-05, elementwise_affine=True)
    (norm3): LayerNorm((1536,), eps=1e-05, elementwise_affine=True)
  )
  init_cfg={'type': 'Pretrained', 'checkpoint': 'https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth'}
  (neck): Sequential(
    (0): FPN(
      (lateral_convs): ModuleList(
        (0): ConvModule(
          (conv): Conv2d(192, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (1): ConvModule(
          (conv): Conv2d(384, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (2): ConvModule(
          (conv): Conv2d(768, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (3): ConvModule(
          (conv): Conv2d(1536, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
      )
      (fpn_convs): ModuleList(
        (0): ConvModule(
          (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (1): ConvModule(
          (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (2): ConvModule(
          (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
        (3): ConvModule(
          (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
        )
      )
    )
    init_cfg={'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
    (1): MultiStageMerging(
      (down): ConvModule(
        (conv): Conv2d(1024, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
        (gn): GroupNorm(32, 256, eps=1e-05, affine=True)
      )
    )
    init_cfg={'type': 'Xavier', 'layer': 'Conv2d', 'distribution': 'uniform'}
  )
  (decode_head): DeformableHeadWithTime(
    input_transform=multiple_select, ignore_index=255, align_corners=False
    (loss_decode): CrossEntropyLoss(avg_non_ignore=False)
    (conv_seg): Conv2d(256, 150, kernel_size=(1, 1), stride=(1, 1))
    (encoder): DetrTransformerEncoder(
      (layers): ModuleList(
        (0): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
        (1): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
        (2): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
        (3): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
        (4): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
        (5): BaseTransformerLayer(
          (attentions): ModuleList(
            (0): MultiScaleDeformableAttention(
              (dropout): Dropout(p=0.0, inplace=False)
              (sampling_offsets): Linear(in_features=256, out_features=64, bias=True)
              (attention_weights): Linear(in_features=256, out_features=32, bias=True)
              (value_proj): Linear(in_features=256, out_features=256, bias=True)
              (output_proj): Linear(in_features=256, out_features=256, bias=True)
            )
          )
          (time_mlp): Sequential(
            (0): SiLU()
            (1): Linear(in_features=1024, out_features=512, bias=True)
          )
          (ffns): ModuleList(
            (0): FFN(
              (activate): GELU()
              (layers): Sequential(
                (0): Sequential(
                  (0): Linear(in_features=256, out_features=1024, bias=True)
                  (1): GELU()
                  (2): Dropout(p=0.0, inplace=False)
                )
                (1): Linear(in_features=1024, out_features=256, bias=True)
                (2): Dropout(p=0.0, inplace=False)
              )
              (dropout_layer): Identity()
            )
          )
          (norms): ModuleList(
            (0): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
            (1): LayerNorm((256,), eps=1e-05, elementwise_affine=True)
          )
        )
      )
    )
    (positional_encoding): SinePositionalEncoding(num_feats=128, temperature=10000, normalize=True, scale=6.283185307179586, eps=1e-06)
  )
  init_cfg={'type': 'Normal', 'std': 0.01, 'override': {'name': 'conv_seg'}}
  (auxiliary_head): FCNHead(
    input_transform=None, ignore_index=255, align_corners=False
    (loss_decode): CrossEntropyLoss(avg_non_ignore=False)
    (conv_seg): Conv2d(256, 150, kernel_size=(1, 1), stride=(1, 1))
    (dropout): Dropout2d(p=0.1, inplace=False)
    (convs): Sequential(
      (0): ConvModule(
        (conv): Conv2d(256, 256, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
        (bn): SyncBatchNorm(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        (activate): ReLU(inplace=True)
      )
    )
  )
  init_cfg={'type': 'Normal', 'std': 0.01, 'override': {'name': 'conv_seg'}}
  (embedding_table): Embedding(151, 256)
  (transform): ConvModule(
    (conv): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1))
  )
  (time_mlp): Sequential(
    (0): LearnedSinusoidalPosEmb()
    (1): Linear(in_features=17, out_features=1024, bias=True)
    (2): GELU()
    (3): Linear(in_features=1024, out_features=1024, bias=True)
  )
)
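Editor's note: the module dump above ends with the diffusion-time conditioning path: a model-level time_mlp (LearnedSinusoidalPosEmb -> Linear(17, 1024) -> GELU -> Linear(1024, 1024)) plus a SiLU + Linear(1024, 512) head printed inside every BaseTransformerLayer of the decode head. The sketch below only reproduces those printed shapes; the body of LearnedSinusoidalPosEmb and the use of the 512-dim per-layer output as a scale/shift pair for the 256-dim tokens follow the common learned-sinusoidal recipe and are assumptions, not something the log itself confirms.

import math
import torch
import torch.nn as nn

class LearnedSinusoidalPosEmb(nn.Module):
    # Assumed implementation of the printed LearnedSinusoidalPosEmb: a learned
    # frequency vector, sin/cos features, and the raw timestep concatenated on.
    # dim=16 yields 16 + 1 = 17 features, matching Linear(in_features=17, ...).
    def __init__(self, dim: int = 16):
        super().__init__()
        assert dim % 2 == 0
        self.weights = nn.Parameter(torch.randn(dim // 2))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        t = t[:, None]                                           # (B, 1)
        freqs = t * self.weights[None, :] * 2 * math.pi          # (B, dim/2)
        fourier = torch.cat([freqs.sin(), freqs.cos()], dim=-1)  # (B, dim)
        return torch.cat([t, fourier], dim=-1)                   # (B, dim + 1)

# Model-level time embedding, matching the printed (time_mlp) above.
time_mlp = nn.Sequential(
    LearnedSinusoidalPosEmb(16),
    nn.Linear(17, 1024),
    nn.GELU(),
    nn.Linear(1024, 1024),
)

# Per-layer head, matching the (time_mlp) printed inside each BaseTransformerLayer.
layer_time_mlp = nn.Sequential(nn.SiLU(), nn.Linear(1024, 512))

t = torch.randint(0, 1000, (4,)).float()             # a batch of 4 diffusion timesteps
emb = time_mlp(t)                                    # (4, 1024)
scale, shift = layer_time_mlp(emb).chunk(2, dim=-1)  # assumed FiLM-style split into 2 x 256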
2023-02-19 04:03:42,053 - mmseg - INFO - Model size: 778.62
2023-02-19 04:03:42,356 - mmseg - INFO - Loaded 20210 images
2023-02-19 04:03:43,213 - mmseg - INFO - Loaded 2000 images
2023-02-19 04:03:43,523 - mmseg - INFO - Hooks will be executed in the following order:
before_run:
(VERY_HIGH   ) PolyLrUpdaterHook                  
(NORMAL      ) CheckpointHook                     
(LOW         ) DistEvalHook                       
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_train_epoch:
(VERY_HIGH   ) PolyLrUpdaterHook                  
(LOW         ) IterTimerHook                      
(LOW         ) DistEvalHook                       
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_train_iter:
(VERY_HIGH   ) PolyLrUpdaterHook                  
(LOW         ) IterTimerHook                      
(LOW         ) DistEvalHook                       
 -------------------- 
after_train_iter:
(ABOVE_NORMAL) OptimizerHook                      
(NORMAL      ) CheckpointHook                     
(LOW         ) IterTimerHook                      
(LOW         ) DistEvalHook                       
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
after_train_epoch:
(NORMAL      ) CheckpointHook                     
(LOW         ) DistEvalHook                       
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_val_epoch:
(LOW         ) IterTimerHook                      
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
before_val_iter:
(LOW         ) IterTimerHook                      
 -------------------- 
after_val_iter:
(LOW         ) IterTimerHook                      
 -------------------- 
after_val_epoch:
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
after_run:
(VERY_LOW    ) TextLoggerHook                     
 -------------------- 
2023-02-19 04:03:43,524 - mmseg - INFO - workflow: [('train', 1)], max: 160000 iters
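Editor's note: the PolyLrUpdaterHook registered above, together with max: 160000 iters and the lr column below, is consistent with a linear warmup over roughly the first 1500 iterations followed by a polynomial decay lr = base_lr * (1 - iter/max_iters)^power with base_lr ~ 6e-05 and power ~ 1.0 (e.g. 6e-05 * (1 - 2000/160000) = 5.925e-05, the value logged at iter 2000). The warmup_iters/warmup_ratio values are not printed anywhere in this log, so the sketch below uses assumed defaults inferred from the logged lr values, not the run's actual config.

def poly_lr(it: int,
            base_lr: float = 6e-05,
            max_iters: int = 160000,
            power: float = 1.0,
            min_lr: float = 0.0,
            warmup_iters: int = 1500,
            warmup_ratio: float = 1e-06) -> float:
    # Polynomial decay (mmcv-style): lr falls from base_lr toward min_lr.
    regular = (base_lr - min_lr) * (1 - it / max_iters) ** power + min_lr
    if it < warmup_iters:
        # Linear warmup from base_lr * warmup_ratio up to the regular schedule.
        k = (1 - it / warmup_iters) * (1 - warmup_ratio)
        return regular * (1 - k)
    return regular

print(poly_lr(2000))  # ~5.925e-05, matching the entry logged at iter 2000
print(poly_lr(50))    # ~2.0e-06, close to the 1.959e-06 logged at iter 50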
2023-02-19 04:04:05,544 - mmseg - INFO - Iter [50/160000]	lr: 1.959e-06, eta: 13:51:59, time: 0.312, data_time: 0.005, memory: 15214, decode.loss_ce: 4.6176, decode.acc_seg: 0.5870, aux.loss_ce: 1.6120, aux.acc_seg: 0.4375, loss: 6.2295, grad_norm: 16.1646
2023-02-19 04:04:19,457 - mmseg - INFO - Iter [100/160000]	lr: 3.958e-06, eta: 13:06:40, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 3.5798, decode.acc_seg: 13.1671, aux.loss_ce: 1.6045, aux.acc_seg: 0.6935, loss: 5.1843, grad_norm: 12.6668
2023-02-19 04:04:33,355 - mmseg - INFO - Iter [150/160000]	lr: 5.955e-06, eta: 12:51:06, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 3.0136, decode.acc_seg: 24.7743, aux.loss_ce: 1.6124, aux.acc_seg: 2.2721, loss: 4.6261, grad_norm: 9.6024
2023-02-19 04:04:47,585 - mmseg - INFO - Iter [200/160000]	lr: 7.950e-06, eta: 12:47:38, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 2.5400, decode.acc_seg: 36.3597, aux.loss_ce: 1.5634, aux.acc_seg: 9.9345, loss: 4.1034, grad_norm: 9.9188
2023-02-19 04:05:01,461 - mmseg - INFO - Iter [250/160000]	lr: 9.945e-06, eta: 12:41:41, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 2.1533, decode.acc_seg: 46.8199, aux.loss_ce: 1.5501, aux.acc_seg: 26.8242, loss: 3.7035, grad_norm: 10.1419
2023-02-19 04:05:15,394 - mmseg - INFO - Iter [300/160000]	lr: 1.194e-05, eta: 12:38:10, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 1.7917, decode.acc_seg: 56.0662, aux.loss_ce: 1.4835, aux.acc_seg: 41.9214, loss: 3.2753, grad_norm: 9.8253
2023-02-19 04:05:29,412 - mmseg - INFO - Iter [350/160000]	lr: 1.393e-05, eta: 12:36:00, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 1.6385, decode.acc_seg: 58.3709, aux.loss_ce: 1.4176, aux.acc_seg: 45.3586, loss: 3.0562, grad_norm: 9.2588
2023-02-19 04:05:44,788 - mmseg - INFO - Iter [400/160000]	lr: 1.592e-05, eta: 12:43:46, time: 0.308, data_time: 0.005, memory: 15214, decode.loss_ce: 1.4713, decode.acc_seg: 62.4882, aux.loss_ce: 1.3532, aux.acc_seg: 49.8817, loss: 2.8245, grad_norm: 9.1530
2023-02-19 04:05:58,647 - mmseg - INFO - Iter [450/160000]	lr: 1.791e-05, eta: 12:40:33, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 1.3982, decode.acc_seg: 62.7354, aux.loss_ce: 1.2908, aux.acc_seg: 50.3450, loss: 2.6890, grad_norm: 11.6549
2023-02-19 04:06:12,465 - mmseg - INFO - Iter [500/160000]	lr: 1.990e-05, eta: 12:37:44, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 1.2632, decode.acc_seg: 65.2759, aux.loss_ce: 1.2181, aux.acc_seg: 52.0753, loss: 2.4813, grad_norm: 8.9392
2023-02-19 04:06:26,614 - mmseg - INFO - Iter [550/160000]	lr: 2.188e-05, eta: 12:37:00, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 1.2660, decode.acc_seg: 65.0744, aux.loss_ce: 1.1537, aux.acc_seg: 52.4811, loss: 2.4196, grad_norm: 9.7015
2023-02-19 04:06:40,597 - mmseg - INFO - Iter [600/160000]	lr: 2.387e-05, eta: 12:35:37, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 1.1524, decode.acc_seg: 66.9273, aux.loss_ce: 1.0865, aux.acc_seg: 54.6636, loss: 2.2389, grad_norm: 8.9352
2023-02-19 04:06:54,731 - mmseg - INFO - Iter [650/160000]	lr: 2.585e-05, eta: 12:34:54, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 1.0936, decode.acc_seg: 67.6582, aux.loss_ce: 1.0236, aux.acc_seg: 55.2001, loss: 2.1172, grad_norm: 9.1459
2023-02-19 04:07:08,568 - mmseg - INFO - Iter [700/160000]	lr: 2.784e-05, eta: 12:33:21, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 1.1324, decode.acc_seg: 67.3122, aux.loss_ce: 1.0062, aux.acc_seg: 55.5568, loss: 2.1386, grad_norm: 10.5368
2023-02-19 04:07:22,701 - mmseg - INFO - Iter [750/160000]	lr: 2.982e-05, eta: 12:32:49, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 1.0639, decode.acc_seg: 68.4342, aux.loss_ce: 0.9449, aux.acc_seg: 55.1017, loss: 2.0088, grad_norm: 8.0060
2023-02-19 04:07:36,888 - mmseg - INFO - Iter [800/160000]	lr: 3.180e-05, eta: 12:32:42, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 1.0377, decode.acc_seg: 68.5513, aux.loss_ce: 0.8865, aux.acc_seg: 56.5419, loss: 1.9242, grad_norm: 9.3954
2023-02-19 04:07:50,840 - mmseg - INFO - Iter [850/160000]	lr: 3.378e-05, eta: 12:31:44, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.9809, decode.acc_seg: 69.6576, aux.loss_ce: 0.8391, aux.acc_seg: 57.8523, loss: 1.8199, grad_norm: 10.4314
2023-02-19 04:08:05,629 - mmseg - INFO - Iter [900/160000]	lr: 3.576e-05, eta: 12:33:13, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9269, decode.acc_seg: 70.9267, aux.loss_ce: 0.8037, aux.acc_seg: 57.0601, loss: 1.7306, grad_norm: 9.2592
2023-02-19 04:08:19,916 - mmseg - INFO - Iter [950/160000]	lr: 3.773e-05, eta: 12:33:18, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9529, decode.acc_seg: 69.9417, aux.loss_ce: 0.7841, aux.acc_seg: 57.1192, loss: 1.7370, grad_norm: 7.6900
2023-02-19 04:08:33,896 - mmseg - INFO - Saving checkpoint at 1000 iterations
2023-02-19 04:08:37,067 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:08:37,067 - mmseg - INFO - Iter [1000/160000]	lr: 3.971e-05, eta: 12:40:53, time: 0.343, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9063, decode.acc_seg: 70.8045, aux.loss_ce: 0.7178, aux.acc_seg: 60.3444, loss: 1.6240, grad_norm: 8.2720
2023-02-19 04:08:51,706 - mmseg - INFO - Iter [1050/160000]	lr: 4.168e-05, eta: 12:41:15, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.9593, decode.acc_seg: 69.4465, aux.loss_ce: 0.7323, aux.acc_seg: 59.0828, loss: 1.6916, grad_norm: 9.0100
2023-02-19 04:09:05,889 - mmseg - INFO - Iter [1100/160000]	lr: 4.366e-05, eta: 12:40:39, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9162, decode.acc_seg: 70.4854, aux.loss_ce: 0.7158, aux.acc_seg: 59.3944, loss: 1.6320, grad_norm: 8.0269
2023-02-19 04:09:20,321 - mmseg - INFO - Iter [1150/160000]	lr: 4.563e-05, eta: 12:40:30, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9211, decode.acc_seg: 71.1563, aux.loss_ce: 0.6838, aux.acc_seg: 60.5913, loss: 1.6049, grad_norm: 8.9189
2023-02-19 04:09:34,619 - mmseg - INFO - Iter [1200/160000]	lr: 4.760e-05, eta: 12:40:11, time: 0.287, data_time: 0.006, memory: 15214, decode.loss_ce: 0.8598, decode.acc_seg: 72.3752, aux.loss_ce: 0.6282, aux.acc_seg: 63.0879, loss: 1.4880, grad_norm: 9.3403
2023-02-19 04:09:48,434 - mmseg - INFO - Iter [1250/160000]	lr: 4.957e-05, eta: 12:38:44, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.9897, decode.acc_seg: 68.9295, aux.loss_ce: 0.6456, aux.acc_seg: 60.7235, loss: 1.6353, grad_norm: 10.6260
2023-02-19 04:10:04,749 - mmseg - INFO - Iter [1300/160000]	lr: 5.154e-05, eta: 12:42:34, time: 0.327, data_time: 0.049, memory: 15214, decode.loss_ce: 0.8661, decode.acc_seg: 71.1490, aux.loss_ce: 0.5970, aux.acc_seg: 63.6253, loss: 1.4631, grad_norm: 9.4742
2023-02-19 04:10:18,644 - mmseg - INFO - Iter [1350/160000]	lr: 5.351e-05, eta: 12:41:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8316, decode.acc_seg: 72.4683, aux.loss_ce: 0.5707, aux.acc_seg: 64.4645, loss: 1.4023, grad_norm: 7.5594
2023-02-19 04:10:32,346 - mmseg - INFO - Iter [1400/160000]	lr: 5.547e-05, eta: 12:39:46, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8632, decode.acc_seg: 72.1430, aux.loss_ce: 0.5660, aux.acc_seg: 64.5516, loss: 1.4291, grad_norm: 8.7594
2023-02-19 04:10:46,865 - mmseg - INFO - Iter [1450/160000]	lr: 5.744e-05, eta: 12:39:47, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8259, decode.acc_seg: 72.0220, aux.loss_ce: 0.5431, aux.acc_seg: 66.2275, loss: 1.3690, grad_norm: 7.9872
2023-02-19 04:11:00,677 - mmseg - INFO - Iter [1500/160000]	lr: 5.940e-05, eta: 12:38:33, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8153, decode.acc_seg: 73.0635, aux.loss_ce: 0.5242, aux.acc_seg: 66.7169, loss: 1.3395, grad_norm: 6.5510
2023-02-19 04:11:14,483 - mmseg - INFO - Iter [1550/160000]	lr: 5.942e-05, eta: 12:37:23, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.8099, decode.acc_seg: 72.8524, aux.loss_ce: 0.5024, aux.acc_seg: 67.4492, loss: 1.3123, grad_norm: 7.1674
2023-02-19 04:11:28,192 - mmseg - INFO - Iter [1600/160000]	lr: 5.940e-05, eta: 12:36:06, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.8660, decode.acc_seg: 71.0133, aux.loss_ce: 0.5276, aux.acc_seg: 65.1900, loss: 1.3936, grad_norm: 7.7979
2023-02-19 04:11:42,069 - mmseg - INFO - Iter [1650/160000]	lr: 5.938e-05, eta: 12:35:09, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7871, decode.acc_seg: 73.0413, aux.loss_ce: 0.4922, aux.acc_seg: 67.3698, loss: 1.2793, grad_norm: 6.5483
2023-02-19 04:11:55,779 - mmseg - INFO - Iter [1700/160000]	lr: 5.936e-05, eta: 12:33:59, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7912, decode.acc_seg: 72.9833, aux.loss_ce: 0.4589, aux.acc_seg: 69.2988, loss: 1.2501, grad_norm: 8.5577
2023-02-19 04:12:09,449 - mmseg - INFO - Iter [1750/160000]	lr: 5.934e-05, eta: 12:32:49, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7940, decode.acc_seg: 72.3268, aux.loss_ce: 0.4750, aux.acc_seg: 67.3248, loss: 1.2690, grad_norm: 7.6273
2023-02-19 04:12:23,173 - mmseg - INFO - Iter [1800/160000]	lr: 5.933e-05, eta: 12:31:46, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.8077, decode.acc_seg: 72.7836, aux.loss_ce: 0.4738, aux.acc_seg: 68.9028, loss: 1.2814, grad_norm: 7.1944
2023-02-19 04:12:36,887 - mmseg - INFO - Iter [1850/160000]	lr: 5.931e-05, eta: 12:30:46, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7577, decode.acc_seg: 74.1655, aux.loss_ce: 0.4485, aux.acc_seg: 68.9957, loss: 1.2062, grad_norm: 7.4086
2023-02-19 04:12:50,776 - mmseg - INFO - Iter [1900/160000]	lr: 5.929e-05, eta: 12:30:02, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7648, decode.acc_seg: 73.6627, aux.loss_ce: 0.4422, aux.acc_seg: 69.4201, loss: 1.2070, grad_norm: 6.6059
2023-02-19 04:13:05,173 - mmseg - INFO - Iter [1950/160000]	lr: 5.927e-05, eta: 12:30:01, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7592, decode.acc_seg: 73.3597, aux.loss_ce: 0.4473, aux.acc_seg: 68.4411, loss: 1.2066, grad_norm: 6.6353
2023-02-19 04:13:18,871 - mmseg - INFO - Saving checkpoint at 2000 iterations
2023-02-19 04:13:22,227 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:13:22,227 - mmseg - INFO - Iter [2000/160000]	lr: 5.925e-05, eta: 12:33:30, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8163, decode.acc_seg: 71.3371, aux.loss_ce: 0.4519, aux.acc_seg: 67.4811, loss: 1.2683, grad_norm: 7.0369
2023-02-19 04:13:36,043 - mmseg - INFO - Iter [2050/160000]	lr: 5.923e-05, eta: 12:32:38, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8164, decode.acc_seg: 73.2610, aux.loss_ce: 0.4532, aux.acc_seg: 68.7917, loss: 1.2696, grad_norm: 7.7308
2023-02-19 04:13:49,993 - mmseg - INFO - Iter [2100/160000]	lr: 5.921e-05, eta: 12:31:57, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.8034, decode.acc_seg: 72.1956, aux.loss_ce: 0.4414, aux.acc_seg: 68.2517, loss: 1.2448, grad_norm: 6.3926
2023-02-19 04:14:03,621 - mmseg - INFO - Iter [2150/160000]	lr: 5.919e-05, eta: 12:30:55, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.7893, decode.acc_seg: 72.9019, aux.loss_ce: 0.4403, aux.acc_seg: 67.6491, loss: 1.2296, grad_norm: 6.5839
2023-02-19 04:14:18,048 - mmseg - INFO - Iter [2200/160000]	lr: 5.918e-05, eta: 12:30:51, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.7680, decode.acc_seg: 73.6167, aux.loss_ce: 0.4223, aux.acc_seg: 69.9345, loss: 1.1903, grad_norm: 7.2036
2023-02-19 04:14:32,170 - mmseg - INFO - Iter [2250/160000]	lr: 5.916e-05, eta: 12:30:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.7849, decode.acc_seg: 72.4819, aux.loss_ce: 0.4323, aux.acc_seg: 68.4742, loss: 1.2171, grad_norm: 9.6680
2023-02-19 04:14:46,253 - mmseg - INFO - Iter [2300/160000]	lr: 5.914e-05, eta: 12:29:59, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7761, decode.acc_seg: 73.2031, aux.loss_ce: 0.4076, aux.acc_seg: 69.7728, loss: 1.1837, grad_norm: 7.1106
2023-02-19 04:15:00,378 - mmseg - INFO - Iter [2350/160000]	lr: 5.912e-05, eta: 12:29:33, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7548, decode.acc_seg: 72.9492, aux.loss_ce: 0.4092, aux.acc_seg: 69.1884, loss: 1.1641, grad_norm: 6.8615
2023-02-19 04:15:14,451 - mmseg - INFO - Iter [2400/160000]	lr: 5.910e-05, eta: 12:29:08, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.7362, decode.acc_seg: 74.7498, aux.loss_ce: 0.3847, aux.acc_seg: 71.7075, loss: 1.1209, grad_norm: 6.8210
2023-02-19 04:15:28,090 - mmseg - INFO - Iter [2450/160000]	lr: 5.908e-05, eta: 12:28:14, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.8361, decode.acc_seg: 70.8703, aux.loss_ce: 0.4258, aux.acc_seg: 68.3381, loss: 1.2619, grad_norm: 7.6220
2023-02-19 04:15:42,063 - mmseg - INFO - Iter [2500/160000]	lr: 5.906e-05, eta: 12:27:42, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7589, decode.acc_seg: 73.1000, aux.loss_ce: 0.4058, aux.acc_seg: 69.2607, loss: 1.1648, grad_norm: 6.6631
2023-02-19 04:15:58,034 - mmseg - INFO - Iter [2550/160000]	lr: 5.904e-05, eta: 12:29:15, time: 0.319, data_time: 0.046, memory: 15214, decode.loss_ce: 0.7265, decode.acc_seg: 73.5806, aux.loss_ce: 0.3774, aux.acc_seg: 71.3542, loss: 1.1038, grad_norm: 6.4313
2023-02-19 04:16:11,612 - mmseg - INFO - Iter [2600/160000]	lr: 5.903e-05, eta: 12:28:18, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7033, decode.acc_seg: 75.4703, aux.loss_ce: 0.3882, aux.acc_seg: 71.3563, loss: 1.0914, grad_norm: 6.9518
2023-02-19 04:16:25,559 - mmseg - INFO - Iter [2650/160000]	lr: 5.901e-05, eta: 12:27:45, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7092, decode.acc_seg: 75.0415, aux.loss_ce: 0.3762, aux.acc_seg: 72.2415, loss: 1.0854, grad_norm: 5.6267
2023-02-19 04:16:39,637 - mmseg - INFO - Iter [2700/160000]	lr: 5.899e-05, eta: 12:27:21, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6852, decode.acc_seg: 75.3843, aux.loss_ce: 0.3669, aux.acc_seg: 71.9444, loss: 1.0521, grad_norm: 7.8574
2023-02-19 04:16:53,767 - mmseg - INFO - Iter [2750/160000]	lr: 5.897e-05, eta: 12:26:59, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6890, decode.acc_seg: 75.2542, aux.loss_ce: 0.3687, aux.acc_seg: 71.7087, loss: 1.0577, grad_norm: 5.9514
2023-02-19 04:17:08,389 - mmseg - INFO - Iter [2800/160000]	lr: 5.895e-05, eta: 12:27:06, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7274, decode.acc_seg: 74.2414, aux.loss_ce: 0.3627, aux.acc_seg: 71.6713, loss: 1.0901, grad_norm: 5.8784
2023-02-19 04:17:22,080 - mmseg - INFO - Iter [2850/160000]	lr: 5.893e-05, eta: 12:26:20, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.7058, decode.acc_seg: 75.3759, aux.loss_ce: 0.3619, aux.acc_seg: 72.3678, loss: 1.0677, grad_norm: 7.4144
2023-02-19 04:17:35,877 - mmseg - INFO - Iter [2900/160000]	lr: 5.891e-05, eta: 12:25:42, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7140, decode.acc_seg: 75.3286, aux.loss_ce: 0.3652, aux.acc_seg: 72.3506, loss: 1.0792, grad_norm: 7.4007
2023-02-19 04:17:50,063 - mmseg - INFO - Iter [2950/160000]	lr: 5.889e-05, eta: 12:25:25, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7126, decode.acc_seg: 75.8356, aux.loss_ce: 0.3661, aux.acc_seg: 72.6573, loss: 1.0787, grad_norm: 6.7681
2023-02-19 04:18:04,120 - mmseg - INFO - Saving checkpoint at 3000 iterations
2023-02-19 04:18:07,353 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:18:07,353 - mmseg - INFO - Iter [3000/160000]	lr: 5.888e-05, eta: 12:27:50, time: 0.346, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6906, decode.acc_seg: 75.4149, aux.loss_ce: 0.3517, aux.acc_seg: 72.2259, loss: 1.0423, grad_norm: 5.5512
2023-02-19 04:18:21,016 - mmseg - INFO - Iter [3050/160000]	lr: 5.886e-05, eta: 12:27:04, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7474, decode.acc_seg: 74.0683, aux.loss_ce: 0.3672, aux.acc_seg: 71.9143, loss: 1.1147, grad_norm: 6.9374
2023-02-19 04:18:34,610 - mmseg - INFO - Iter [3100/160000]	lr: 5.884e-05, eta: 12:26:14, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7017, decode.acc_seg: 75.3537, aux.loss_ce: 0.3547, aux.acc_seg: 72.6224, loss: 1.0563, grad_norm: 5.5874
2023-02-19 04:18:48,548 - mmseg - INFO - Iter [3150/160000]	lr: 5.882e-05, eta: 12:25:44, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7171, decode.acc_seg: 74.5180, aux.loss_ce: 0.3499, aux.acc_seg: 72.5810, loss: 1.0670, grad_norm: 5.8649
2023-02-19 04:19:02,428 - mmseg - INFO - Iter [3200/160000]	lr: 5.880e-05, eta: 12:25:11, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7019, decode.acc_seg: 75.2299, aux.loss_ce: 0.3504, aux.acc_seg: 72.1580, loss: 1.0523, grad_norm: 5.9925
2023-02-19 04:19:16,399 - mmseg - INFO - Iter [3250/160000]	lr: 5.878e-05, eta: 12:24:43, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6994, decode.acc_seg: 75.5654, aux.loss_ce: 0.3456, aux.acc_seg: 73.2162, loss: 1.0450, grad_norm: 6.4579
2023-02-19 04:19:30,528 - mmseg - INFO - Iter [3300/160000]	lr: 5.876e-05, eta: 12:24:22, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6596, decode.acc_seg: 75.7799, aux.loss_ce: 0.3269, aux.acc_seg: 73.2128, loss: 0.9865, grad_norm: 5.0987
2023-02-19 04:19:45,249 - mmseg - INFO - Iter [3350/160000]	lr: 5.874e-05, eta: 12:24:30, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6792, decode.acc_seg: 75.8049, aux.loss_ce: 0.3295, aux.acc_seg: 73.8557, loss: 1.0087, grad_norm: 6.2236
2023-02-19 04:19:59,372 - mmseg - INFO - Iter [3400/160000]	lr: 5.873e-05, eta: 12:24:10, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6357, decode.acc_seg: 76.5873, aux.loss_ce: 0.3025, aux.acc_seg: 74.9052, loss: 0.9382, grad_norm: 5.2693
2023-02-19 04:20:13,516 - mmseg - INFO - Iter [3450/160000]	lr: 5.871e-05, eta: 12:23:50, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6537, decode.acc_seg: 76.6667, aux.loss_ce: 0.3148, aux.acc_seg: 74.7427, loss: 0.9684, grad_norm: 6.0273
2023-02-19 04:20:27,423 - mmseg - INFO - Iter [3500/160000]	lr: 5.869e-05, eta: 12:23:19, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6793, decode.acc_seg: 75.6162, aux.loss_ce: 0.3253, aux.acc_seg: 73.8341, loss: 1.0047, grad_norm: 5.6310
2023-02-19 04:20:41,361 - mmseg - INFO - Iter [3550/160000]	lr: 5.867e-05, eta: 12:22:53, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6696, decode.acc_seg: 76.0026, aux.loss_ce: 0.3245, aux.acc_seg: 73.5517, loss: 0.9940, grad_norm: 6.1337
2023-02-19 04:20:56,564 - mmseg - INFO - Iter [3600/160000]	lr: 5.865e-05, eta: 12:23:20, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6920, decode.acc_seg: 76.2011, aux.loss_ce: 0.3258, aux.acc_seg: 74.4651, loss: 1.0178, grad_norm: 6.3336
2023-02-19 04:21:10,365 - mmseg - INFO - Iter [3650/160000]	lr: 5.863e-05, eta: 12:22:46, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6871, decode.acc_seg: 75.9417, aux.loss_ce: 0.3229, aux.acc_seg: 74.0429, loss: 1.0100, grad_norm: 6.2694
2023-02-19 04:21:24,972 - mmseg - INFO - Iter [3700/160000]	lr: 5.861e-05, eta: 12:22:47, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6664, decode.acc_seg: 76.0788, aux.loss_ce: 0.3145, aux.acc_seg: 74.4879, loss: 0.9808, grad_norm: 5.6447
2023-02-19 04:21:38,955 - mmseg - INFO - Iter [3750/160000]	lr: 5.859e-05, eta: 12:22:20, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.7050, decode.acc_seg: 75.0273, aux.loss_ce: 0.3369, aux.acc_seg: 72.7193, loss: 1.0419, grad_norm: 6.3509
2023-02-19 04:21:54,928 - mmseg - INFO - Iter [3800/160000]	lr: 5.858e-05, eta: 12:23:17, time: 0.320, data_time: 0.049, memory: 15214, decode.loss_ce: 0.6675, decode.acc_seg: 76.3029, aux.loss_ce: 0.3179, aux.acc_seg: 74.3431, loss: 0.9853, grad_norm: 6.6323
2023-02-19 04:22:08,663 - mmseg - INFO - Iter [3850/160000]	lr: 5.856e-05, eta: 12:22:41, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6514, decode.acc_seg: 76.6392, aux.loss_ce: 0.3110, aux.acc_seg: 74.5827, loss: 0.9624, grad_norm: 6.0549
2023-02-19 04:22:23,239 - mmseg - INFO - Iter [3900/160000]	lr: 5.854e-05, eta: 12:22:38, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6770, decode.acc_seg: 75.8734, aux.loss_ce: 0.3196, aux.acc_seg: 74.6826, loss: 0.9966, grad_norm: 6.2689
2023-02-19 04:22:37,039 - mmseg - INFO - Iter [3950/160000]	lr: 5.852e-05, eta: 12:22:06, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6394, decode.acc_seg: 76.1664, aux.loss_ce: 0.3008, aux.acc_seg: 75.4483, loss: 0.9402, grad_norm: 7.8063
2023-02-19 04:22:51,028 - mmseg - INFO - Saving checkpoint at 4000 iterations
2023-02-19 04:22:54,255 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:22:54,255 - mmseg - INFO - Iter [4000/160000]	lr: 5.850e-05, eta: 12:23:47, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6107, decode.acc_seg: 77.8427, aux.loss_ce: 0.2849, aux.acc_seg: 76.7705, loss: 0.8956, grad_norm: 5.8948
2023-02-19 04:23:08,174 - mmseg - INFO - Iter [4050/160000]	lr: 5.848e-05, eta: 12:23:18, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5875, decode.acc_seg: 78.2228, aux.loss_ce: 0.2817, aux.acc_seg: 76.6059, loss: 0.8692, grad_norm: 6.0144
2023-02-19 04:23:22,222 - mmseg - INFO - Iter [4100/160000]	lr: 5.846e-05, eta: 12:22:53, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5942, decode.acc_seg: 78.9607, aux.loss_ce: 0.2828, aux.acc_seg: 76.9480, loss: 0.8770, grad_norm: 4.7647
2023-02-19 04:23:36,381 - mmseg - INFO - Iter [4150/160000]	lr: 5.844e-05, eta: 12:22:34, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6480, decode.acc_seg: 77.2824, aux.loss_ce: 0.2996, aux.acc_seg: 75.7040, loss: 0.9476, grad_norm: 5.9429
2023-02-19 04:23:50,743 - mmseg - INFO - Iter [4200/160000]	lr: 5.843e-05, eta: 12:22:23, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5992, decode.acc_seg: 77.2806, aux.loss_ce: 0.2778, aux.acc_seg: 76.0458, loss: 0.8770, grad_norm: 4.8809
2023-02-19 04:24:04,464 - mmseg - INFO - Iter [4250/160000]	lr: 5.841e-05, eta: 12:21:48, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6299, decode.acc_seg: 76.8802, aux.loss_ce: 0.2894, aux.acc_seg: 76.1819, loss: 0.9193, grad_norm: 5.4407
2023-02-19 04:24:18,121 - mmseg - INFO - Iter [4300/160000]	lr: 5.839e-05, eta: 12:21:10, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6488, decode.acc_seg: 77.3176, aux.loss_ce: 0.3022, aux.acc_seg: 75.5528, loss: 0.9510, grad_norm: 5.3391
2023-02-19 04:24:31,769 - mmseg - INFO - Iter [4350/160000]	lr: 5.837e-05, eta: 12:20:33, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6141, decode.acc_seg: 78.0718, aux.loss_ce: 0.2825, aux.acc_seg: 76.9996, loss: 0.8965, grad_norm: 5.5944
2023-02-19 04:24:45,348 - mmseg - INFO - Iter [4400/160000]	lr: 5.835e-05, eta: 12:19:55, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6147, decode.acc_seg: 77.7966, aux.loss_ce: 0.2843, aux.acc_seg: 76.5172, loss: 0.8990, grad_norm: 5.7844
2023-02-19 04:24:59,607 - mmseg - INFO - Iter [4450/160000]	lr: 5.833e-05, eta: 12:19:40, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6637, decode.acc_seg: 76.7688, aux.loss_ce: 0.2991, aux.acc_seg: 75.4599, loss: 0.9628, grad_norm: 6.1457
2023-02-19 04:25:13,807 - mmseg - INFO - Iter [4500/160000]	lr: 5.831e-05, eta: 12:19:24, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5679, decode.acc_seg: 78.6239, aux.loss_ce: 0.2658, aux.acc_seg: 77.4221, loss: 0.8337, grad_norm: 5.0653
2023-02-19 04:25:27,682 - mmseg - INFO - Iter [4550/160000]	lr: 5.829e-05, eta: 12:18:56, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6698, decode.acc_seg: 76.1927, aux.loss_ce: 0.3046, aux.acc_seg: 74.9344, loss: 0.9743, grad_norm: 6.2713
2023-02-19 04:25:41,467 - mmseg - INFO - Iter [4600/160000]	lr: 5.828e-05, eta: 12:18:26, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6218, decode.acc_seg: 77.3669, aux.loss_ce: 0.2777, aux.acc_seg: 76.8180, loss: 0.8995, grad_norm: 6.0047
2023-02-19 04:25:55,818 - mmseg - INFO - Iter [4650/160000]	lr: 5.826e-05, eta: 12:18:15, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6385, decode.acc_seg: 76.5295, aux.loss_ce: 0.2901, aux.acc_seg: 75.8549, loss: 0.9287, grad_norm: 5.3391
2023-02-19 04:26:09,748 - mmseg - INFO - Iter [4700/160000]	lr: 5.824e-05, eta: 12:17:49, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6018, decode.acc_seg: 77.8243, aux.loss_ce: 0.2778, aux.acc_seg: 76.5111, loss: 0.8795, grad_norm: 6.4244
2023-02-19 04:26:23,414 - mmseg - INFO - Iter [4750/160000]	lr: 5.822e-05, eta: 12:17:16, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6127, decode.acc_seg: 76.8390, aux.loss_ce: 0.2741, aux.acc_seg: 75.8166, loss: 0.8868, grad_norm: 6.3588
2023-02-19 04:26:37,842 - mmseg - INFO - Iter [4800/160000]	lr: 5.820e-05, eta: 12:17:08, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6287, decode.acc_seg: 77.1908, aux.loss_ce: 0.2839, aux.acc_seg: 76.8140, loss: 0.9125, grad_norm: 6.7294
2023-02-19 04:26:52,296 - mmseg - INFO - Iter [4850/160000]	lr: 5.818e-05, eta: 12:17:00, time: 0.289, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6160, decode.acc_seg: 77.4913, aux.loss_ce: 0.2867, aux.acc_seg: 76.0734, loss: 0.9027, grad_norm: 5.1120
2023-02-19 04:27:06,634 - mmseg - INFO - Iter [4900/160000]	lr: 5.816e-05, eta: 12:16:48, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5684, decode.acc_seg: 79.1383, aux.loss_ce: 0.2538, aux.acc_seg: 78.3313, loss: 0.8222, grad_norm: 5.6675
2023-02-19 04:27:20,456 - mmseg - INFO - Iter [4950/160000]	lr: 5.814e-05, eta: 12:16:21, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6151, decode.acc_seg: 77.1944, aux.loss_ce: 0.2758, aux.acc_seg: 76.1234, loss: 0.8910, grad_norm: 6.3856
2023-02-19 04:27:34,156 - mmseg - INFO - Saving checkpoint at 5000 iterations
2023-02-19 04:27:37,385 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:27:37,385 - mmseg - INFO - Iter [5000/160000]	lr: 5.813e-05, eta: 12:17:30, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6402, decode.acc_seg: 76.1707, aux.loss_ce: 0.2836, aux.acc_seg: 75.0830, loss: 0.9238, grad_norm: 6.1200
2023-02-19 04:27:52,330 - mmseg - INFO - Iter [5050/160000]	lr: 5.811e-05, eta: 12:17:36, time: 0.299, data_time: 0.004, memory: 15214, decode.loss_ce: 0.6063, decode.acc_seg: 78.0031, aux.loss_ce: 0.2643, aux.acc_seg: 77.5477, loss: 0.8707, grad_norm: 6.5962
2023-02-19 04:28:08,354 - mmseg - INFO - Iter [5100/160000]	lr: 5.809e-05, eta: 12:18:14, time: 0.320, data_time: 0.047, memory: 15214, decode.loss_ce: 0.5816, decode.acc_seg: 79.0267, aux.loss_ce: 0.2628, aux.acc_seg: 78.0806, loss: 0.8444, grad_norm: 4.8903
2023-02-19 04:28:22,671 - mmseg - INFO - Iter [5150/160000]	lr: 5.807e-05, eta: 12:18:00, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5861, decode.acc_seg: 78.9608, aux.loss_ce: 0.2677, aux.acc_seg: 77.6010, loss: 0.8538, grad_norm: 6.1048
2023-02-19 04:28:37,489 - mmseg - INFO - Iter [5200/160000]	lr: 5.805e-05, eta: 12:18:01, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5910, decode.acc_seg: 77.5544, aux.loss_ce: 0.2659, aux.acc_seg: 76.9409, loss: 0.8569, grad_norm: 4.9119
2023-02-19 04:28:52,456 - mmseg - INFO - Iter [5250/160000]	lr: 5.803e-05, eta: 12:18:07, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6154, decode.acc_seg: 77.3397, aux.loss_ce: 0.2665, aux.acc_seg: 77.2071, loss: 0.8819, grad_norm: 5.2920
2023-02-19 04:29:06,677 - mmseg - INFO - Iter [5300/160000]	lr: 5.801e-05, eta: 12:17:50, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5740, decode.acc_seg: 79.0090, aux.loss_ce: 0.2501, aux.acc_seg: 78.7237, loss: 0.8241, grad_norm: 4.7745
2023-02-19 04:29:20,695 - mmseg - INFO - Iter [5350/160000]	lr: 5.799e-05, eta: 12:17:28, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6131, decode.acc_seg: 77.5315, aux.loss_ce: 0.2684, aux.acc_seg: 76.9976, loss: 0.8816, grad_norm: 5.4752
2023-02-19 04:29:34,740 - mmseg - INFO - Iter [5400/160000]	lr: 5.798e-05, eta: 12:17:06, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5946, decode.acc_seg: 78.7046, aux.loss_ce: 0.2630, aux.acc_seg: 77.8312, loss: 0.8577, grad_norm: 5.8990
2023-02-19 04:29:48,800 - mmseg - INFO - Iter [5450/160000]	lr: 5.796e-05, eta: 12:16:45, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5718, decode.acc_seg: 79.0490, aux.loss_ce: 0.2548, aux.acc_seg: 78.0288, loss: 0.8267, grad_norm: 5.3780
2023-02-19 04:30:02,858 - mmseg - INFO - Iter [5500/160000]	lr: 5.794e-05, eta: 12:16:23, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6097, decode.acc_seg: 77.4479, aux.loss_ce: 0.2663, aux.acc_seg: 76.8731, loss: 0.8760, grad_norm: 5.0566
2023-02-19 04:30:16,744 - mmseg - INFO - Iter [5550/160000]	lr: 5.792e-05, eta: 12:15:58, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5802, decode.acc_seg: 78.7976, aux.loss_ce: 0.2566, aux.acc_seg: 77.9590, loss: 0.8368, grad_norm: 5.3645
2023-02-19 04:30:30,705 - mmseg - INFO - Iter [5600/160000]	lr: 5.790e-05, eta: 12:15:34, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5751, decode.acc_seg: 78.3954, aux.loss_ce: 0.2530, aux.acc_seg: 77.8044, loss: 0.8281, grad_norm: 5.5756
2023-02-19 04:30:44,709 - mmseg - INFO - Iter [5650/160000]	lr: 5.788e-05, eta: 12:15:11, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5769, decode.acc_seg: 78.8867, aux.loss_ce: 0.2536, aux.acc_seg: 78.1600, loss: 0.8305, grad_norm: 5.2577
2023-02-19 04:30:59,587 - mmseg - INFO - Iter [5700/160000]	lr: 5.786e-05, eta: 12:15:14, time: 0.298, data_time: 0.006, memory: 15214, decode.loss_ce: 0.5630, decode.acc_seg: 79.1375, aux.loss_ce: 0.2489, aux.acc_seg: 78.3235, loss: 0.8119, grad_norm: 5.7718
2023-02-19 04:31:13,865 - mmseg - INFO - Iter [5750/160000]	lr: 5.784e-05, eta: 12:14:59, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5983, decode.acc_seg: 78.1367, aux.loss_ce: 0.2604, aux.acc_seg: 77.6470, loss: 0.8587, grad_norm: 5.8214
2023-02-19 04:31:27,870 - mmseg - INFO - Iter [5800/160000]	lr: 5.783e-05, eta: 12:14:37, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5547, decode.acc_seg: 79.4321, aux.loss_ce: 0.2415, aux.acc_seg: 78.6616, loss: 0.7962, grad_norm: 6.1445
2023-02-19 04:31:42,381 - mmseg - INFO - Iter [5850/160000]	lr: 5.781e-05, eta: 12:14:28, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5378, decode.acc_seg: 79.5172, aux.loss_ce: 0.2346, aux.acc_seg: 78.6917, loss: 0.7725, grad_norm: 5.3127
2023-02-19 04:31:56,534 - mmseg - INFO - Iter [5900/160000]	lr: 5.779e-05, eta: 12:14:09, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5536, decode.acc_seg: 79.7428, aux.loss_ce: 0.2446, aux.acc_seg: 78.4339, loss: 0.7982, grad_norm: 4.8775
2023-02-19 04:32:10,895 - mmseg - INFO - Iter [5950/160000]	lr: 5.777e-05, eta: 12:13:58, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5782, decode.acc_seg: 78.8357, aux.loss_ce: 0.2540, aux.acc_seg: 77.8047, loss: 0.8322, grad_norm: 6.1428
2023-02-19 04:32:24,762 - mmseg - INFO - Saving checkpoint at 6000 iterations
2023-02-19 04:32:28,062 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:32:28,063 - mmseg - INFO - Iter [6000/160000]	lr: 5.775e-05, eta: 12:14:57, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.6024, decode.acc_seg: 77.8539, aux.loss_ce: 0.2578, aux.acc_seg: 77.2040, loss: 0.8603, grad_norm: 6.1134
2023-02-19 04:32:41,991 - mmseg - INFO - Iter [6050/160000]	lr: 5.773e-05, eta: 12:14:33, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5956, decode.acc_seg: 78.4761, aux.loss_ce: 0.2530, aux.acc_seg: 78.2447, loss: 0.8486, grad_norm: 5.3695
2023-02-19 04:32:56,144 - mmseg - INFO - Iter [6100/160000]	lr: 5.771e-05, eta: 12:14:15, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5306, decode.acc_seg: 79.8229, aux.loss_ce: 0.2297, aux.acc_seg: 79.4810, loss: 0.7603, grad_norm: 4.7894
2023-02-19 04:33:09,757 - mmseg - INFO - Iter [6150/160000]	lr: 5.769e-05, eta: 12:13:43, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5571, decode.acc_seg: 79.5442, aux.loss_ce: 0.2390, aux.acc_seg: 78.8589, loss: 0.7961, grad_norm: 5.3460
2023-02-19 04:33:23,821 - mmseg - INFO - Iter [6200/160000]	lr: 5.768e-05, eta: 12:13:22, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5936, decode.acc_seg: 78.2532, aux.loss_ce: 0.2544, aux.acc_seg: 77.8413, loss: 0.8479, grad_norm: 6.4724
2023-02-19 04:33:37,502 - mmseg - INFO - Iter [6250/160000]	lr: 5.766e-05, eta: 12:12:53, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5983, decode.acc_seg: 78.3216, aux.loss_ce: 0.2553, aux.acc_seg: 77.8129, loss: 0.8536, grad_norm: 6.2560
2023-02-19 04:33:52,350 - mmseg - INFO - Iter [6300/160000]	lr: 5.764e-05, eta: 12:12:51, time: 0.296, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5420, decode.acc_seg: 79.8232, aux.loss_ce: 0.2403, aux.acc_seg: 79.0745, loss: 0.7823, grad_norm: 5.6655
2023-02-19 04:34:08,537 - mmseg - INFO - Iter [6350/160000]	lr: 5.762e-05, eta: 12:13:23, time: 0.324, data_time: 0.048, memory: 15214, decode.loss_ce: 0.5366, decode.acc_seg: 79.7654, aux.loss_ce: 0.2348, aux.acc_seg: 79.1556, loss: 0.7714, grad_norm: 7.3473
2023-02-19 04:34:22,497 - mmseg - INFO - Iter [6400/160000]	lr: 5.760e-05, eta: 12:12:59, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5397, decode.acc_seg: 79.4147, aux.loss_ce: 0.2393, aux.acc_seg: 78.2246, loss: 0.7789, grad_norm: 5.2966
2023-02-19 04:34:36,631 - mmseg - INFO - Iter [6450/160000]	lr: 5.758e-05, eta: 12:12:42, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5317, decode.acc_seg: 80.7439, aux.loss_ce: 0.2312, aux.acc_seg: 79.9894, loss: 0.7629, grad_norm: 6.7956
2023-02-19 04:34:50,573 - mmseg - INFO - Iter [6500/160000]	lr: 5.756e-05, eta: 12:12:18, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4941, decode.acc_seg: 81.2027, aux.loss_ce: 0.2173, aux.acc_seg: 80.3858, loss: 0.7114, grad_norm: 5.2393
2023-02-19 04:35:04,235 - mmseg - INFO - Iter [6550/160000]	lr: 5.754e-05, eta: 12:11:49, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5538, decode.acc_seg: 78.9960, aux.loss_ce: 0.2405, aux.acc_seg: 78.3443, loss: 0.7943, grad_norm: 5.4695
2023-02-19 04:35:18,227 - mmseg - INFO - Iter [6600/160000]	lr: 5.753e-05, eta: 12:11:27, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5756, decode.acc_seg: 79.1432, aux.loss_ce: 0.2480, aux.acc_seg: 78.7546, loss: 0.8236, grad_norm: 6.5106
2023-02-19 04:35:32,109 - mmseg - INFO - Iter [6650/160000]	lr: 5.751e-05, eta: 12:11:03, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5293, decode.acc_seg: 80.0810, aux.loss_ce: 0.2294, aux.acc_seg: 79.6171, loss: 0.7587, grad_norm: 5.8753
2023-02-19 04:35:45,849 - mmseg - INFO - Iter [6700/160000]	lr: 5.749e-05, eta: 12:10:36, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5617, decode.acc_seg: 79.3696, aux.loss_ce: 0.2420, aux.acc_seg: 78.8270, loss: 0.8038, grad_norm: 6.1265
2023-02-19 04:35:59,510 - mmseg - INFO - Iter [6750/160000]	lr: 5.747e-05, eta: 12:10:07, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5457, decode.acc_seg: 79.7029, aux.loss_ce: 0.2382, aux.acc_seg: 78.6994, loss: 0.7839, grad_norm: 8.4197
2023-02-19 04:36:13,408 - mmseg - INFO - Iter [6800/160000]	lr: 5.745e-05, eta: 12:09:44, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5248, decode.acc_seg: 80.2327, aux.loss_ce: 0.2217, aux.acc_seg: 79.9761, loss: 0.7465, grad_norm: 7.6456
2023-02-19 04:36:27,632 - mmseg - INFO - Iter [6850/160000]	lr: 5.743e-05, eta: 12:09:28, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5410, decode.acc_seg: 80.3819, aux.loss_ce: 0.2323, aux.acc_seg: 79.8202, loss: 0.7732, grad_norm: 6.0862
2023-02-19 04:36:41,537 - mmseg - INFO - Iter [6900/160000]	lr: 5.741e-05, eta: 12:09:05, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5533, decode.acc_seg: 79.7503, aux.loss_ce: 0.2406, aux.acc_seg: 78.6553, loss: 0.7939, grad_norm: 6.6488
2023-02-19 04:36:55,319 - mmseg - INFO - Iter [6950/160000]	lr: 5.739e-05, eta: 12:08:39, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5806, decode.acc_seg: 78.6756, aux.loss_ce: 0.2459, aux.acc_seg: 78.5524, loss: 0.8265, grad_norm: 5.7709
2023-02-19 04:37:09,036 - mmseg - INFO - Saving checkpoint at 7000 iterations
2023-02-19 04:37:12,338 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:37:12,338 - mmseg - INFO - Iter [7000/160000]	lr: 5.738e-05, eta: 12:09:26, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5329, decode.acc_seg: 79.9040, aux.loss_ce: 0.2275, aux.acc_seg: 79.7697, loss: 0.7604, grad_norm: 6.6210
2023-02-19 04:37:25,875 - mmseg - INFO - Iter [7050/160000]	lr: 5.736e-05, eta: 12:08:55, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5498, decode.acc_seg: 79.4513, aux.loss_ce: 0.2364, aux.acc_seg: 79.1345, loss: 0.7862, grad_norm: 5.5800
2023-02-19 04:37:40,138 - mmseg - INFO - Iter [7100/160000]	lr: 5.734e-05, eta: 12:08:40, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5262, decode.acc_seg: 79.7333, aux.loss_ce: 0.2255, aux.acc_seg: 79.1508, loss: 0.7517, grad_norm: 6.3234
2023-02-19 04:37:53,856 - mmseg - INFO - Iter [7150/160000]	lr: 5.732e-05, eta: 12:08:13, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5541, decode.acc_seg: 79.9933, aux.loss_ce: 0.2338, aux.acc_seg: 79.6246, loss: 0.7880, grad_norm: 6.0945
2023-02-19 04:38:07,844 - mmseg - INFO - Iter [7200/160000]	lr: 5.730e-05, eta: 12:07:52, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5427, decode.acc_seg: 79.5424, aux.loss_ce: 0.2283, aux.acc_seg: 79.2170, loss: 0.7709, grad_norm: 5.3850
2023-02-19 04:38:21,970 - mmseg - INFO - Iter [7250/160000]	lr: 5.728e-05, eta: 12:07:34, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5462, decode.acc_seg: 79.8099, aux.loss_ce: 0.2309, aux.acc_seg: 79.4620, loss: 0.7771, grad_norm: 5.4094
2023-02-19 04:38:35,567 - mmseg - INFO - Iter [7300/160000]	lr: 5.726e-05, eta: 12:07:05, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5572, decode.acc_seg: 79.2789, aux.loss_ce: 0.2355, aux.acc_seg: 78.7262, loss: 0.7927, grad_norm: 5.0542
2023-02-19 04:38:50,238 - mmseg - INFO - Iter [7350/160000]	lr: 5.724e-05, eta: 12:06:59, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5411, decode.acc_seg: 80.2261, aux.loss_ce: 0.2299, aux.acc_seg: 79.7758, loss: 0.7710, grad_norm: 5.8419
2023-02-19 04:39:03,962 - mmseg - INFO - Iter [7400/160000]	lr: 5.723e-05, eta: 12:06:33, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5623, decode.acc_seg: 79.1423, aux.loss_ce: 0.2342, aux.acc_seg: 79.2436, loss: 0.7965, grad_norm: 6.0218
2023-02-19 04:39:17,685 - mmseg - INFO - Iter [7450/160000]	lr: 5.721e-05, eta: 12:06:08, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5475, decode.acc_seg: 79.5913, aux.loss_ce: 0.2314, aux.acc_seg: 78.9488, loss: 0.7789, grad_norm: 5.7134
2023-02-19 04:39:31,449 - mmseg - INFO - Iter [7500/160000]	lr: 5.719e-05, eta: 12:05:43, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5607, decode.acc_seg: 78.9987, aux.loss_ce: 0.2363, aux.acc_seg: 78.7902, loss: 0.7969, grad_norm: 4.8849
2023-02-19 04:39:45,215 - mmseg - INFO - Iter [7550/160000]	lr: 5.717e-05, eta: 12:05:18, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5432, decode.acc_seg: 79.7323, aux.loss_ce: 0.2268, aux.acc_seg: 79.4234, loss: 0.7700, grad_norm: 5.9319
2023-02-19 04:40:01,509 - mmseg - INFO - Iter [7600/160000]	lr: 5.715e-05, eta: 12:05:44, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.5406, decode.acc_seg: 80.4059, aux.loss_ce: 0.2316, aux.acc_seg: 79.6392, loss: 0.7722, grad_norm: 4.5965
2023-02-19 04:40:15,431 - mmseg - INFO - Iter [7650/160000]	lr: 5.713e-05, eta: 12:05:23, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5629, decode.acc_seg: 79.4854, aux.loss_ce: 0.2381, aux.acc_seg: 79.3051, loss: 0.8010, grad_norm: 6.0114
2023-02-19 04:40:29,334 - mmseg - INFO - Iter [7700/160000]	lr: 5.711e-05, eta: 12:05:01, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5097, decode.acc_seg: 80.1563, aux.loss_ce: 0.2191, aux.acc_seg: 79.5661, loss: 0.7289, grad_norm: 4.5598
2023-02-19 04:40:43,119 - mmseg - INFO - Iter [7750/160000]	lr: 5.709e-05, eta: 12:04:37, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5517, decode.acc_seg: 79.6868, aux.loss_ce: 0.2368, aux.acc_seg: 79.6340, loss: 0.7885, grad_norm: 5.4073
2023-02-19 04:40:56,742 - mmseg - INFO - Iter [7800/160000]	lr: 5.708e-05, eta: 12:04:10, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5194, decode.acc_seg: 80.8713, aux.loss_ce: 0.2202, aux.acc_seg: 80.1830, loss: 0.7396, grad_norm: 5.9555
2023-02-19 04:41:11,176 - mmseg - INFO - Iter [7850/160000]	lr: 5.706e-05, eta: 12:03:59, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4994, decode.acc_seg: 80.9348, aux.loss_ce: 0.2108, aux.acc_seg: 80.4322, loss: 0.7102, grad_norm: 5.0043
2023-02-19 04:41:25,867 - mmseg - INFO - Iter [7900/160000]	lr: 5.704e-05, eta: 12:03:53, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5341, decode.acc_seg: 79.6452, aux.loss_ce: 0.2264, aux.acc_seg: 79.4463, loss: 0.7604, grad_norm: 5.8729
2023-02-19 04:41:39,590 - mmseg - INFO - Iter [7950/160000]	lr: 5.702e-05, eta: 12:03:28, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5156, decode.acc_seg: 80.9130, aux.loss_ce: 0.2243, aux.acc_seg: 80.0321, loss: 0.7399, grad_norm: 5.0511
2023-02-19 04:41:54,137 - mmseg - INFO - Saving checkpoint at 8000 iterations
2023-02-19 04:41:57,364 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:41:57,364 - mmseg - INFO - Iter [8000/160000]	lr: 5.700e-05, eta: 12:04:20, time: 0.356, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5157, decode.acc_seg: 80.2484, aux.loss_ce: 0.2215, aux.acc_seg: 79.5067, loss: 0.7372, grad_norm: 5.7896
2023-02-19 04:42:11,634 - mmseg - INFO - Iter [8050/160000]	lr: 5.698e-05, eta: 12:04:05, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5093, decode.acc_seg: 80.7741, aux.loss_ce: 0.2139, aux.acc_seg: 80.4693, loss: 0.7231, grad_norm: 5.1903
2023-02-19 04:42:25,221 - mmseg - INFO - Iter [8100/160000]	lr: 5.696e-05, eta: 12:03:37, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5185, decode.acc_seg: 80.8008, aux.loss_ce: 0.2210, aux.acc_seg: 80.3528, loss: 0.7395, grad_norm: 4.4627
2023-02-19 04:42:39,912 - mmseg - INFO - Iter [8150/160000]	lr: 5.694e-05, eta: 12:03:31, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5361, decode.acc_seg: 79.9630, aux.loss_ce: 0.2272, aux.acc_seg: 79.4772, loss: 0.7633, grad_norm: 5.6520
2023-02-19 04:42:53,817 - mmseg - INFO - Iter [8200/160000]	lr: 5.693e-05, eta: 12:03:09, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5202, decode.acc_seg: 80.8835, aux.loss_ce: 0.2265, aux.acc_seg: 79.9304, loss: 0.7467, grad_norm: 5.0139
2023-02-19 04:43:07,386 - mmseg - INFO - Iter [8250/160000]	lr: 5.691e-05, eta: 12:02:42, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5084, decode.acc_seg: 80.9602, aux.loss_ce: 0.2165, aux.acc_seg: 80.8500, loss: 0.7249, grad_norm: 5.3529
2023-02-19 04:43:22,196 - mmseg - INFO - Iter [8300/160000]	lr: 5.689e-05, eta: 12:02:36, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5094, decode.acc_seg: 80.8658, aux.loss_ce: 0.2119, aux.acc_seg: 80.8440, loss: 0.7213, grad_norm: 4.9467
2023-02-19 04:43:36,193 - mmseg - INFO - Iter [8350/160000]	lr: 5.687e-05, eta: 12:02:17, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5007, decode.acc_seg: 81.3895, aux.loss_ce: 0.2131, aux.acc_seg: 80.4852, loss: 0.7138, grad_norm: 5.9356
2023-02-19 04:43:50,222 - mmseg - INFO - Iter [8400/160000]	lr: 5.685e-05, eta: 12:01:58, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5317, decode.acc_seg: 80.2471, aux.loss_ce: 0.2242, aux.acc_seg: 79.5978, loss: 0.7559, grad_norm: 5.2020
2023-02-19 04:44:04,020 - mmseg - INFO - Iter [8450/160000]	lr: 5.683e-05, eta: 12:01:35, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5323, decode.acc_seg: 80.1590, aux.loss_ce: 0.2241, aux.acc_seg: 79.6591, loss: 0.7564, grad_norm: 4.8201
2023-02-19 04:44:17,740 - mmseg - INFO - Iter [8500/160000]	lr: 5.681e-05, eta: 12:01:11, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5285, decode.acc_seg: 80.2474, aux.loss_ce: 0.2236, aux.acc_seg: 79.8440, loss: 0.7521, grad_norm: 5.1349
2023-02-19 04:44:31,574 - mmseg - INFO - Iter [8550/160000]	lr: 5.679e-05, eta: 12:00:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5156, decode.acc_seg: 80.6467, aux.loss_ce: 0.2156, aux.acc_seg: 80.2415, loss: 0.7311, grad_norm: 5.4190
2023-02-19 04:44:45,862 - mmseg - INFO - Iter [8600/160000]	lr: 5.678e-05, eta: 12:00:34, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5004, decode.acc_seg: 80.8779, aux.loss_ce: 0.2163, aux.acc_seg: 80.2750, loss: 0.7167, grad_norm: 5.7854
2023-02-19 04:45:00,073 - mmseg - INFO - Iter [8650/160000]	lr: 5.676e-05, eta: 12:00:19, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4986, decode.acc_seg: 81.0110, aux.loss_ce: 0.2134, aux.acc_seg: 80.3053, loss: 0.7120, grad_norm: 6.1549
2023-02-19 04:45:13,892 - mmseg - INFO - Iter [8700/160000]	lr: 5.674e-05, eta: 11:59:57, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5167, decode.acc_seg: 80.4178, aux.loss_ce: 0.2171, aux.acc_seg: 80.0751, loss: 0.7338, grad_norm: 5.2580
2023-02-19 04:45:28,060 - mmseg - INFO - Iter [8750/160000]	lr: 5.672e-05, eta: 11:59:41, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5052, decode.acc_seg: 81.6313, aux.loss_ce: 0.2183, aux.acc_seg: 80.6485, loss: 0.7235, grad_norm: 5.6058
2023-02-19 04:45:41,924 - mmseg - INFO - Iter [8800/160000]	lr: 5.670e-05, eta: 11:59:19, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5177, decode.acc_seg: 80.2392, aux.loss_ce: 0.2180, aux.acc_seg: 79.8859, loss: 0.7357, grad_norm: 5.4378
2023-02-19 04:45:58,930 - mmseg - INFO - Iter [8850/160000]	lr: 5.668e-05, eta: 11:59:52, time: 0.340, data_time: 0.048, memory: 15214, decode.loss_ce: 0.4953, decode.acc_seg: 80.7778, aux.loss_ce: 0.2100, aux.acc_seg: 80.3522, loss: 0.7053, grad_norm: 5.9590
2023-02-19 04:46:12,940 - mmseg - INFO - Iter [8900/160000]	lr: 5.666e-05, eta: 11:59:33, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4901, decode.acc_seg: 81.2628, aux.loss_ce: 0.2091, aux.acc_seg: 80.7693, loss: 0.6992, grad_norm: 6.3853
2023-02-19 04:46:27,237 - mmseg - INFO - Iter [8950/160000]	lr: 5.664e-05, eta: 11:59:18, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4786, decode.acc_seg: 82.0737, aux.loss_ce: 0.2033, aux.acc_seg: 81.4350, loss: 0.6820, grad_norm: 4.4442
2023-02-19 04:46:41,464 - mmseg - INFO - Saving checkpoint at 9000 iterations
2023-02-19 04:46:44,685 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:46:44,685 - mmseg - INFO - Iter [9000/160000]	lr: 5.663e-05, eta: 11:59:57, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4873, decode.acc_seg: 80.7177, aux.loss_ce: 0.2067, aux.acc_seg: 80.1163, loss: 0.6940, grad_norm: 5.2054
2023-02-19 04:46:58,850 - mmseg - INFO - Iter [9050/160000]	lr: 5.661e-05, eta: 11:59:40, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4922, decode.acc_seg: 81.3300, aux.loss_ce: 0.2119, aux.acc_seg: 80.7035, loss: 0.7040, grad_norm: 5.3005
2023-02-19 04:47:12,712 - mmseg - INFO - Iter [9100/160000]	lr: 5.659e-05, eta: 11:59:18, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5091, decode.acc_seg: 81.2270, aux.loss_ce: 0.2164, aux.acc_seg: 80.8740, loss: 0.7255, grad_norm: 5.2464
2023-02-19 04:47:26,686 - mmseg - INFO - Iter [9150/160000]	lr: 5.657e-05, eta: 11:58:59, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4713, decode.acc_seg: 82.0560, aux.loss_ce: 0.2054, aux.acc_seg: 81.4005, loss: 0.6768, grad_norm: 5.2057
2023-02-19 04:47:40,599 - mmseg - INFO - Iter [9200/160000]	lr: 5.655e-05, eta: 11:58:39, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5092, decode.acc_seg: 81.4701, aux.loss_ce: 0.2167, aux.acc_seg: 80.5118, loss: 0.7259, grad_norm: 6.2725
2023-02-19 04:47:54,515 - mmseg - INFO - Iter [9250/160000]	lr: 5.653e-05, eta: 11:58:18, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5185, decode.acc_seg: 80.4731, aux.loss_ce: 0.2184, aux.acc_seg: 79.6554, loss: 0.7369, grad_norm: 5.3795
2023-02-19 04:48:08,323 - mmseg - INFO - Iter [9300/160000]	lr: 5.651e-05, eta: 11:57:56, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5071, decode.acc_seg: 81.2595, aux.loss_ce: 0.2129, aux.acc_seg: 80.7167, loss: 0.7200, grad_norm: 5.6466
2023-02-19 04:48:22,267 - mmseg - INFO - Iter [9350/160000]	lr: 5.649e-05, eta: 11:57:36, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5073, decode.acc_seg: 80.6527, aux.loss_ce: 0.2139, aux.acc_seg: 80.0513, loss: 0.7212, grad_norm: 5.4380
2023-02-19 04:48:36,621 - mmseg - INFO - Iter [9400/160000]	lr: 5.648e-05, eta: 11:57:23, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4760, decode.acc_seg: 81.5409, aux.loss_ce: 0.2024, aux.acc_seg: 81.0325, loss: 0.6784, grad_norm: 5.2935
2023-02-19 04:48:51,426 - mmseg - INFO - Iter [9450/160000]	lr: 5.646e-05, eta: 11:57:16, time: 0.296, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4804, decode.acc_seg: 81.6935, aux.loss_ce: 0.2030, aux.acc_seg: 81.4365, loss: 0.6834, grad_norm: 5.1383
2023-02-19 04:49:05,125 - mmseg - INFO - Iter [9500/160000]	lr: 5.644e-05, eta: 11:56:53, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4813, decode.acc_seg: 81.2640, aux.loss_ce: 0.2066, aux.acc_seg: 80.5307, loss: 0.6879, grad_norm: 4.5454
2023-02-19 04:49:18,801 - mmseg - INFO - Iter [9550/160000]	lr: 5.642e-05, eta: 11:56:29, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4734, decode.acc_seg: 82.3887, aux.loss_ce: 0.2040, aux.acc_seg: 81.4233, loss: 0.6773, grad_norm: 5.2221
2023-02-19 04:49:33,370 - mmseg - INFO - Iter [9600/160000]	lr: 5.640e-05, eta: 11:56:19, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4948, decode.acc_seg: 81.3370, aux.loss_ce: 0.2119, aux.acc_seg: 80.6557, loss: 0.7067, grad_norm: 6.3066
2023-02-19 04:49:46,998 - mmseg - INFO - Iter [9650/160000]	lr: 5.638e-05, eta: 11:55:55, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.5070, decode.acc_seg: 80.8725, aux.loss_ce: 0.2091, aux.acc_seg: 80.8279, loss: 0.7161, grad_norm: 4.6679
2023-02-19 04:50:00,749 - mmseg - INFO - Iter [9700/160000]	lr: 5.636e-05, eta: 11:55:32, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4956, decode.acc_seg: 81.4464, aux.loss_ce: 0.2094, aux.acc_seg: 80.9563, loss: 0.7050, grad_norm: 4.7641
2023-02-19 04:50:14,343 - mmseg - INFO - Iter [9750/160000]	lr: 5.634e-05, eta: 11:55:07, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5149, decode.acc_seg: 80.4884, aux.loss_ce: 0.2144, aux.acc_seg: 80.3959, loss: 0.7293, grad_norm: 5.4076
2023-02-19 04:50:28,914 - mmseg - INFO - Iter [9800/160000]	lr: 5.633e-05, eta: 11:54:57, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5301, decode.acc_seg: 80.4244, aux.loss_ce: 0.2176, aux.acc_seg: 80.4638, loss: 0.7476, grad_norm: 5.4339
2023-02-19 04:50:43,934 - mmseg - INFO - Iter [9850/160000]	lr: 5.631e-05, eta: 11:54:54, time: 0.300, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4937, decode.acc_seg: 80.9056, aux.loss_ce: 0.2074, aux.acc_seg: 80.4301, loss: 0.7011, grad_norm: 5.3895
2023-02-19 04:50:57,639 - mmseg - INFO - Iter [9900/160000]	lr: 5.629e-05, eta: 11:54:31, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4904, decode.acc_seg: 81.8453, aux.loss_ce: 0.2058, aux.acc_seg: 81.2726, loss: 0.6962, grad_norm: 5.4186
2023-02-19 04:51:11,417 - mmseg - INFO - Iter [9950/160000]	lr: 5.627e-05, eta: 11:54:09, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4860, decode.acc_seg: 80.7853, aux.loss_ce: 0.2087, aux.acc_seg: 80.0788, loss: 0.6947, grad_norm: 4.8765
2023-02-19 04:51:25,261 - mmseg - INFO - Saving checkpoint at 10000 iterations
2023-02-19 04:51:28,549 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:51:28,549 - mmseg - INFO - Iter [10000/160000]	lr: 5.625e-05, eta: 11:54:38, time: 0.343, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4963, decode.acc_seg: 81.4859, aux.loss_ce: 0.2106, aux.acc_seg: 80.7322, loss: 0.7069, grad_norm: 5.5313
2023-02-19 04:51:42,091 - mmseg - INFO - Iter [10050/160000]	lr: 5.623e-05, eta: 11:54:12, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5119, decode.acc_seg: 81.0662, aux.loss_ce: 0.2157, aux.acc_seg: 80.6078, loss: 0.7276, grad_norm: 5.5895
2023-02-19 04:51:55,798 - mmseg - INFO - Iter [10100/160000]	lr: 5.621e-05, eta: 11:53:50, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.5159, decode.acc_seg: 80.1635, aux.loss_ce: 0.2178, aux.acc_seg: 79.8611, loss: 0.7337, grad_norm: 5.6168
2023-02-19 04:52:11,662 - mmseg - INFO - Iter [10150/160000]	lr: 5.619e-05, eta: 11:53:59, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.5058, decode.acc_seg: 81.2202, aux.loss_ce: 0.2142, aux.acc_seg: 80.7217, loss: 0.7201, grad_norm: 5.0385
2023-02-19 04:52:25,271 - mmseg - INFO - Iter [10200/160000]	lr: 5.618e-05, eta: 11:53:34, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4627, decode.acc_seg: 82.6519, aux.loss_ce: 0.1976, aux.acc_seg: 81.8366, loss: 0.6603, grad_norm: 5.8727
2023-02-19 04:52:39,049 - mmseg - INFO - Iter [10250/160000]	lr: 5.616e-05, eta: 11:53:12, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4549, decode.acc_seg: 83.1226, aux.loss_ce: 0.1945, aux.acc_seg: 82.6651, loss: 0.6494, grad_norm: 4.5053
2023-02-19 04:52:52,695 - mmseg - INFO - Iter [10300/160000]	lr: 5.614e-05, eta: 11:52:49, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4425, decode.acc_seg: 83.0086, aux.loss_ce: 0.1920, aux.acc_seg: 82.6600, loss: 0.6345, grad_norm: 4.9425
2023-02-19 04:53:06,402 - mmseg - INFO - Iter [10350/160000]	lr: 5.612e-05, eta: 11:52:26, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4543, decode.acc_seg: 82.7015, aux.loss_ce: 0.1950, aux.acc_seg: 81.9114, loss: 0.6494, grad_norm: 4.5771
2023-02-19 04:53:20,046 - mmseg - INFO - Iter [10400/160000]	lr: 5.610e-05, eta: 11:52:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4604, decode.acc_seg: 82.7271, aux.loss_ce: 0.1971, aux.acc_seg: 82.0659, loss: 0.6575, grad_norm: 5.0272
2023-02-19 04:53:33,562 - mmseg - INFO - Iter [10450/160000]	lr: 5.608e-05, eta: 11:51:37, time: 0.270, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4470, decode.acc_seg: 82.8645, aux.loss_ce: 0.1907, aux.acc_seg: 82.1307, loss: 0.6377, grad_norm: 4.4121
2023-02-19 04:53:47,179 - mmseg - INFO - Iter [10500/160000]	lr: 5.606e-05, eta: 11:51:14, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4822, decode.acc_seg: 81.9278, aux.loss_ce: 0.2034, aux.acc_seg: 81.3564, loss: 0.6857, grad_norm: 4.4703
2023-02-19 04:54:01,312 - mmseg - INFO - Iter [10550/160000]	lr: 5.604e-05, eta: 11:50:58, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4591, decode.acc_seg: 82.7146, aux.loss_ce: 0.1978, aux.acc_seg: 81.9218, loss: 0.6569, grad_norm: 4.8998
2023-02-19 04:54:14,922 - mmseg - INFO - Iter [10600/160000]	lr: 5.603e-05, eta: 11:50:34, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4633, decode.acc_seg: 82.7249, aux.loss_ce: 0.1969, aux.acc_seg: 82.0483, loss: 0.6602, grad_norm: 4.7708
2023-02-19 04:54:28,811 - mmseg - INFO - Iter [10650/160000]	lr: 5.601e-05, eta: 11:50:14, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4673, decode.acc_seg: 81.7182, aux.loss_ce: 0.1999, aux.acc_seg: 80.9339, loss: 0.6672, grad_norm: 4.7270
2023-02-19 04:54:42,938 - mmseg - INFO - Iter [10700/160000]	lr: 5.599e-05, eta: 11:49:58, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4727, decode.acc_seg: 81.6630, aux.loss_ce: 0.1996, aux.acc_seg: 81.3419, loss: 0.6723, grad_norm: 5.1896
2023-02-19 04:54:56,706 - mmseg - INFO - Iter [10750/160000]	lr: 5.597e-05, eta: 11:49:37, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4585, decode.acc_seg: 81.9470, aux.loss_ce: 0.1942, aux.acc_seg: 81.5150, loss: 0.6527, grad_norm: 5.3289
2023-02-19 04:55:10,467 - mmseg - INFO - Iter [10800/160000]	lr: 5.595e-05, eta: 11:49:16, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4752, decode.acc_seg: 81.8344, aux.loss_ce: 0.1992, aux.acc_seg: 81.3555, loss: 0.6743, grad_norm: 5.5376
2023-02-19 04:55:24,457 - mmseg - INFO - Iter [10850/160000]	lr: 5.593e-05, eta: 11:48:58, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4793, decode.acc_seg: 81.7449, aux.loss_ce: 0.2012, aux.acc_seg: 81.1577, loss: 0.6805, grad_norm: 4.4737
2023-02-19 04:55:38,102 - mmseg - INFO - Iter [10900/160000]	lr: 5.591e-05, eta: 11:48:35, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4818, decode.acc_seg: 81.5522, aux.loss_ce: 0.2052, aux.acc_seg: 81.1523, loss: 0.6870, grad_norm: 4.8203
2023-02-19 04:55:52,725 - mmseg - INFO - Iter [10950/160000]	lr: 5.589e-05, eta: 11:48:26, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4426, decode.acc_seg: 82.8784, aux.loss_ce: 0.1900, aux.acc_seg: 82.1580, loss: 0.6326, grad_norm: 4.3616
2023-02-19 04:56:06,725 - mmseg - INFO - Saving checkpoint at 11000 iterations
2023-02-19 04:56:10,007 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 04:56:10,007 - mmseg - INFO - Iter [11000/160000]	lr: 5.588e-05, eta: 11:48:53, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4826, decode.acc_seg: 81.6901, aux.loss_ce: 0.2044, aux.acc_seg: 81.1927, loss: 0.6870, grad_norm: 5.3877
2023-02-19 04:56:24,391 - mmseg - INFO - Iter [11050/160000]	lr: 5.586e-05, eta: 11:48:40, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4696, decode.acc_seg: 82.4936, aux.loss_ce: 0.2049, aux.acc_seg: 81.4038, loss: 0.6745, grad_norm: 4.3906
2023-02-19 04:56:39,079 - mmseg - INFO - Iter [11100/160000]	lr: 5.584e-05, eta: 11:48:31, time: 0.294, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4928, decode.acc_seg: 81.6175, aux.loss_ce: 0.2080, aux.acc_seg: 81.0199, loss: 0.7007, grad_norm: 5.1795
2023-02-19 04:56:53,405 - mmseg - INFO - Iter [11150/160000]	lr: 5.582e-05, eta: 11:48:17, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4821, decode.acc_seg: 81.4389, aux.loss_ce: 0.2010, aux.acc_seg: 81.0431, loss: 0.6831, grad_norm: 5.1286
2023-02-19 04:57:06,993 - mmseg - INFO - Iter [11200/160000]	lr: 5.580e-05, eta: 11:47:54, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4838, decode.acc_seg: 81.8246, aux.loss_ce: 0.2059, aux.acc_seg: 80.9406, loss: 0.6897, grad_norm: 4.4263
2023-02-19 04:57:20,680 - mmseg - INFO - Iter [11250/160000]	lr: 5.578e-05, eta: 11:47:32, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4495, decode.acc_seg: 82.9579, aux.loss_ce: 0.1906, aux.acc_seg: 82.2138, loss: 0.6401, grad_norm: 7.2150
2023-02-19 04:57:35,044 - mmseg - INFO - Iter [11300/160000]	lr: 5.576e-05, eta: 11:47:19, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4477, decode.acc_seg: 82.3316, aux.loss_ce: 0.1897, aux.acc_seg: 81.9297, loss: 0.6375, grad_norm: 4.3510
2023-02-19 04:57:49,624 - mmseg - INFO - Iter [11350/160000]	lr: 5.574e-05, eta: 11:47:08, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4634, decode.acc_seg: 82.2995, aux.loss_ce: 0.1927, aux.acc_seg: 81.8949, loss: 0.6561, grad_norm: 5.0971
2023-02-19 04:58:05,914 - mmseg - INFO - Iter [11400/160000]	lr: 5.573e-05, eta: 11:47:20, time: 0.326, data_time: 0.048, memory: 15214, decode.loss_ce: 0.4762, decode.acc_seg: 81.9126, aux.loss_ce: 0.1986, aux.acc_seg: 81.4982, loss: 0.6748, grad_norm: 5.3230
2023-02-19 04:58:19,560 - mmseg - INFO - Iter [11450/160000]	lr: 5.571e-05, eta: 11:46:58, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4221, decode.acc_seg: 83.8119, aux.loss_ce: 0.1838, aux.acc_seg: 82.7900, loss: 0.6059, grad_norm: 4.6241
2023-02-19 04:58:33,378 - mmseg - INFO - Iter [11500/160000]	lr: 5.569e-05, eta: 11:46:38, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4320, decode.acc_seg: 83.2687, aux.loss_ce: 0.1855, aux.acc_seg: 82.7895, loss: 0.6175, grad_norm: 4.2525
2023-02-19 04:58:47,199 - mmseg - INFO - Iter [11550/160000]	lr: 5.567e-05, eta: 11:46:18, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4401, decode.acc_seg: 82.9103, aux.loss_ce: 0.1878, aux.acc_seg: 82.3073, loss: 0.6279, grad_norm: 4.5175
2023-02-19 04:59:01,098 - mmseg - INFO - Iter [11600/160000]	lr: 5.565e-05, eta: 11:45:59, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4455, decode.acc_seg: 82.7768, aux.loss_ce: 0.1889, aux.acc_seg: 82.1798, loss: 0.6345, grad_norm: 4.3265
2023-02-19 04:59:15,427 - mmseg - INFO - Iter [11650/160000]	lr: 5.563e-05, eta: 11:45:45, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4534, decode.acc_seg: 82.5863, aux.loss_ce: 0.1910, aux.acc_seg: 82.2256, loss: 0.6444, grad_norm: 4.8494
2023-02-19 04:59:29,003 - mmseg - INFO - Iter [11700/160000]	lr: 5.561e-05, eta: 11:45:22, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4571, decode.acc_seg: 82.6223, aux.loss_ce: 0.1945, aux.acc_seg: 82.1611, loss: 0.6516, grad_norm: 4.9155
2023-02-19 04:59:42,968 - mmseg - INFO - Iter [11750/160000]	lr: 5.559e-05, eta: 11:45:04, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4451, decode.acc_seg: 83.4561, aux.loss_ce: 0.1902, aux.acc_seg: 82.7928, loss: 0.6353, grad_norm: 4.6028
2023-02-19 04:59:57,411 - mmseg - INFO - Iter [11800/160000]	lr: 5.558e-05, eta: 11:44:52, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4751, decode.acc_seg: 82.3955, aux.loss_ce: 0.2037, aux.acc_seg: 81.2551, loss: 0.6788, grad_norm: 5.3978
2023-02-19 05:00:11,296 - mmseg - INFO - Iter [11850/160000]	lr: 5.556e-05, eta: 11:44:33, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4358, decode.acc_seg: 83.4560, aux.loss_ce: 0.1853, aux.acc_seg: 82.7508, loss: 0.6211, grad_norm: 7.4108
2023-02-19 05:00:25,358 - mmseg - INFO - Iter [11900/160000]	lr: 5.554e-05, eta: 11:44:16, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4639, decode.acc_seg: 82.3262, aux.loss_ce: 0.1977, aux.acc_seg: 81.5854, loss: 0.6615, grad_norm: 4.5046
2023-02-19 05:00:38,935 - mmseg - INFO - Iter [11950/160000]	lr: 5.552e-05, eta: 11:43:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4631, decode.acc_seg: 82.5062, aux.loss_ce: 0.1937, aux.acc_seg: 81.8775, loss: 0.6569, grad_norm: 4.5620
2023-02-19 05:00:53,572 - mmseg - INFO - Saving checkpoint at 12000 iterations
2023-02-19 05:00:56,816 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:00:56,816 - mmseg - INFO - Iter [12000/160000]	lr: 5.550e-05, eta: 11:44:24, time: 0.358, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4378, decode.acc_seg: 83.2842, aux.loss_ce: 0.1865, aux.acc_seg: 82.6773, loss: 0.6243, grad_norm: 4.7354
2023-02-19 05:01:10,880 - mmseg - INFO - Iter [12050/160000]	lr: 5.548e-05, eta: 11:44:07, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4790, decode.acc_seg: 81.8175, aux.loss_ce: 0.2019, aux.acc_seg: 81.4267, loss: 0.6809, grad_norm: 4.7994
2023-02-19 05:01:24,679 - mmseg - INFO - Iter [12100/160000]	lr: 5.546e-05, eta: 11:43:47, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4928, decode.acc_seg: 81.2383, aux.loss_ce: 0.2079, aux.acc_seg: 80.4761, loss: 0.7007, grad_norm: 4.9686
2023-02-19 05:01:38,223 - mmseg - INFO - Iter [12150/160000]	lr: 5.544e-05, eta: 11:43:23, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4310, decode.acc_seg: 83.4326, aux.loss_ce: 0.1835, aux.acc_seg: 82.7919, loss: 0.6145, grad_norm: 4.6180
2023-02-19 05:01:52,318 - mmseg - INFO - Iter [12200/160000]	lr: 5.543e-05, eta: 11:43:07, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4546, decode.acc_seg: 82.4503, aux.loss_ce: 0.1948, aux.acc_seg: 81.6610, loss: 0.6494, grad_norm: 4.5734
2023-02-19 05:02:06,282 - mmseg - INFO - Iter [12250/160000]	lr: 5.541e-05, eta: 11:42:49, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4650, decode.acc_seg: 82.7995, aux.loss_ce: 0.1949, aux.acc_seg: 82.1326, loss: 0.6599, grad_norm: 5.0799
2023-02-19 05:02:19,958 - mmseg - INFO - Iter [12300/160000]	lr: 5.539e-05, eta: 11:42:28, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4630, decode.acc_seg: 82.0280, aux.loss_ce: 0.1982, aux.acc_seg: 81.5404, loss: 0.6612, grad_norm: 5.7598
2023-02-19 05:02:33,674 - mmseg - INFO - Iter [12350/160000]	lr: 5.537e-05, eta: 11:42:07, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4596, decode.acc_seg: 81.9522, aux.loss_ce: 0.1936, aux.acc_seg: 81.5324, loss: 0.6533, grad_norm: 4.4978
2023-02-19 05:02:47,322 - mmseg - INFO - Iter [12400/160000]	lr: 5.535e-05, eta: 11:41:45, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4589, decode.acc_seg: 82.8992, aux.loss_ce: 0.1966, aux.acc_seg: 82.0815, loss: 0.6555, grad_norm: 4.7728
2023-02-19 05:03:00,893 - mmseg - INFO - Iter [12450/160000]	lr: 5.533e-05, eta: 11:41:23, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4449, decode.acc_seg: 82.9491, aux.loss_ce: 0.1924, aux.acc_seg: 82.0590, loss: 0.6373, grad_norm: 4.1685
2023-02-19 05:03:14,575 - mmseg - INFO - Iter [12500/160000]	lr: 5.531e-05, eta: 11:41:02, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4602, decode.acc_seg: 82.5197, aux.loss_ce: 0.1942, aux.acc_seg: 81.9155, loss: 0.6544, grad_norm: 5.7012
2023-02-19 05:03:28,136 - mmseg - INFO - Iter [12550/160000]	lr: 5.529e-05, eta: 11:40:39, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4455, decode.acc_seg: 83.3198, aux.loss_ce: 0.1908, aux.acc_seg: 82.5277, loss: 0.6362, grad_norm: 4.4667
2023-02-19 05:03:42,045 - mmseg - INFO - Iter [12600/160000]	lr: 5.528e-05, eta: 11:40:20, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4557, decode.acc_seg: 83.1805, aux.loss_ce: 0.1929, aux.acc_seg: 82.3897, loss: 0.6486, grad_norm: 4.2641
2023-02-19 05:03:58,682 - mmseg - INFO - Iter [12650/160000]	lr: 5.526e-05, eta: 11:40:34, time: 0.333, data_time: 0.049, memory: 15214, decode.loss_ce: 0.4602, decode.acc_seg: 82.2741, aux.loss_ce: 0.1927, aux.acc_seg: 81.8076, loss: 0.6529, grad_norm: 4.2281
2023-02-19 05:04:12,602 - mmseg - INFO - Iter [12700/160000]	lr: 5.524e-05, eta: 11:40:16, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4387, decode.acc_seg: 83.1371, aux.loss_ce: 0.1871, aux.acc_seg: 82.3837, loss: 0.6258, grad_norm: 4.3229
2023-02-19 05:04:26,331 - mmseg - INFO - Iter [12750/160000]	lr: 5.522e-05, eta: 11:39:56, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4417, decode.acc_seg: 82.9372, aux.loss_ce: 0.1850, aux.acc_seg: 82.5283, loss: 0.6267, grad_norm: 4.8379
2023-02-19 05:04:39,981 - mmseg - INFO - Iter [12800/160000]	lr: 5.520e-05, eta: 11:39:34, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4197, decode.acc_seg: 83.8877, aux.loss_ce: 0.1780, aux.acc_seg: 83.2501, loss: 0.5977, grad_norm: 4.4182
2023-02-19 05:04:54,157 - mmseg - INFO - Iter [12850/160000]	lr: 5.518e-05, eta: 11:39:19, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4286, decode.acc_seg: 83.4908, aux.loss_ce: 0.1859, aux.acc_seg: 82.8910, loss: 0.6145, grad_norm: 4.7745
2023-02-19 05:05:08,106 - mmseg - INFO - Iter [12900/160000]	lr: 5.516e-05, eta: 11:39:01, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4334, decode.acc_seg: 83.9112, aux.loss_ce: 0.1837, aux.acc_seg: 82.9564, loss: 0.6172, grad_norm: 4.7172
2023-02-19 05:05:22,133 - mmseg - INFO - Iter [12950/160000]	lr: 5.514e-05, eta: 11:38:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4389, decode.acc_seg: 83.0710, aux.loss_ce: 0.1879, aux.acc_seg: 82.5496, loss: 0.6268, grad_norm: 4.4941
2023-02-19 05:05:35,744 - mmseg - INFO - Saving checkpoint at 13000 iterations
2023-02-19 05:05:38,964 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:05:38,964 - mmseg - INFO - Iter [13000/160000]	lr: 5.513e-05, eta: 11:38:59, time: 0.337, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4640, decode.acc_seg: 82.7653, aux.loss_ce: 0.1965, aux.acc_seg: 82.1709, loss: 0.6606, grad_norm: 4.8163
2023-02-19 05:05:52,657 - mmseg - INFO - Iter [13050/160000]	lr: 5.511e-05, eta: 11:38:39, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4260, decode.acc_seg: 83.5198, aux.loss_ce: 0.1827, aux.acc_seg: 82.7057, loss: 0.6087, grad_norm: 4.0939
2023-02-19 05:06:06,214 - mmseg - INFO - Iter [13100/160000]	lr: 5.509e-05, eta: 11:38:16, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4344, decode.acc_seg: 82.5954, aux.loss_ce: 0.1804, aux.acc_seg: 82.3484, loss: 0.6148, grad_norm: 4.8879
2023-02-19 05:06:20,380 - mmseg - INFO - Iter [13150/160000]	lr: 5.507e-05, eta: 11:38:01, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4526, decode.acc_seg: 82.8211, aux.loss_ce: 0.1907, aux.acc_seg: 82.1710, loss: 0.6433, grad_norm: 4.8435
2023-02-19 05:06:34,094 - mmseg - INFO - Iter [13200/160000]	lr: 5.505e-05, eta: 11:37:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4118, decode.acc_seg: 83.6556, aux.loss_ce: 0.1777, aux.acc_seg: 82.7328, loss: 0.5895, grad_norm: 4.4497
2023-02-19 05:06:48,777 - mmseg - INFO - Iter [13250/160000]	lr: 5.503e-05, eta: 11:37:31, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4460, decode.acc_seg: 83.5844, aux.loss_ce: 0.1914, aux.acc_seg: 82.9651, loss: 0.6374, grad_norm: 4.9169
2023-02-19 05:07:02,939 - mmseg - INFO - Iter [13300/160000]	lr: 5.501e-05, eta: 11:37:16, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4186, decode.acc_seg: 84.0520, aux.loss_ce: 0.1818, aux.acc_seg: 82.9819, loss: 0.6004, grad_norm: 4.8887
2023-02-19 05:07:16,888 - mmseg - INFO - Iter [13350/160000]	lr: 5.499e-05, eta: 11:36:58, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4681, decode.acc_seg: 82.1358, aux.loss_ce: 0.2001, aux.acc_seg: 81.4690, loss: 0.6682, grad_norm: 5.3413
2023-02-19 05:07:30,654 - mmseg - INFO - Iter [13400/160000]	lr: 5.498e-05, eta: 11:36:39, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4336, decode.acc_seg: 83.5222, aux.loss_ce: 0.1878, aux.acc_seg: 82.9446, loss: 0.6214, grad_norm: 4.4854
2023-02-19 05:07:44,769 - mmseg - INFO - Iter [13450/160000]	lr: 5.496e-05, eta: 11:36:23, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4418, decode.acc_seg: 83.4543, aux.loss_ce: 0.1840, aux.acc_seg: 82.7976, loss: 0.6257, grad_norm: 4.8570
2023-02-19 05:07:58,594 - mmseg - INFO - Iter [13500/160000]	lr: 5.494e-05, eta: 11:36:04, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4725, decode.acc_seg: 82.2831, aux.loss_ce: 0.2006, aux.acc_seg: 81.4549, loss: 0.6732, grad_norm: 6.0509
2023-02-19 05:08:12,164 - mmseg - INFO - Iter [13550/160000]	lr: 5.492e-05, eta: 11:35:42, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4201, decode.acc_seg: 84.1547, aux.loss_ce: 0.1810, aux.acc_seg: 83.2839, loss: 0.6011, grad_norm: 5.3046
2023-02-19 05:08:25,921 - mmseg - INFO - Iter [13600/160000]	lr: 5.490e-05, eta: 11:35:23, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4340, decode.acc_seg: 83.6869, aux.loss_ce: 0.1836, aux.acc_seg: 82.9860, loss: 0.6176, grad_norm: 4.5071
2023-02-19 05:08:39,502 - mmseg - INFO - Iter [13650/160000]	lr: 5.488e-05, eta: 11:35:01, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4479, decode.acc_seg: 82.8095, aux.loss_ce: 0.1873, aux.acc_seg: 82.4849, loss: 0.6352, grad_norm: 4.6896
2023-02-19 05:08:53,583 - mmseg - INFO - Iter [13700/160000]	lr: 5.486e-05, eta: 11:34:45, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4403, decode.acc_seg: 83.1523, aux.loss_ce: 0.1875, aux.acc_seg: 82.2417, loss: 0.6278, grad_norm: 4.7968
2023-02-19 05:09:07,165 - mmseg - INFO - Iter [13750/160000]	lr: 5.484e-05, eta: 11:34:24, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4253, decode.acc_seg: 83.4570, aux.loss_ce: 0.1806, aux.acc_seg: 82.9968, loss: 0.6059, grad_norm: 4.1605
2023-02-19 05:09:21,425 - mmseg - INFO - Iter [13800/160000]	lr: 5.483e-05, eta: 11:34:10, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4369, decode.acc_seg: 83.1291, aux.loss_ce: 0.1873, aux.acc_seg: 82.2214, loss: 0.6242, grad_norm: 4.5488
2023-02-19 05:09:35,852 - mmseg - INFO - Iter [13850/160000]	lr: 5.481e-05, eta: 11:33:57, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4211, decode.acc_seg: 84.0691, aux.loss_ce: 0.1812, aux.acc_seg: 83.1032, loss: 0.6023, grad_norm: 5.2836
2023-02-19 05:09:51,813 - mmseg - INFO - Iter [13900/160000]	lr: 5.479e-05, eta: 11:34:01, time: 0.319, data_time: 0.048, memory: 15214, decode.loss_ce: 0.4154, decode.acc_seg: 83.9330, aux.loss_ce: 0.1793, aux.acc_seg: 83.1626, loss: 0.5947, grad_norm: 4.5887
2023-02-19 05:10:05,947 - mmseg - INFO - Iter [13950/160000]	lr: 5.477e-05, eta: 11:33:46, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4078, decode.acc_seg: 84.1290, aux.loss_ce: 0.1766, aux.acc_seg: 83.3719, loss: 0.5844, grad_norm: 4.6663
2023-02-19 05:10:19,950 - mmseg - INFO - Saving checkpoint at 14000 iterations
2023-02-19 05:10:23,262 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:10:23,262 - mmseg - INFO - Iter [14000/160000]	lr: 5.475e-05, eta: 11:34:04, time: 0.346, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4012, decode.acc_seg: 84.9208, aux.loss_ce: 0.1743, aux.acc_seg: 83.9205, loss: 0.5755, grad_norm: 3.7309
2023-02-19 05:10:36,873 - mmseg - INFO - Iter [14050/160000]	lr: 5.473e-05, eta: 11:33:42, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4371, decode.acc_seg: 83.4021, aux.loss_ce: 0.1906, aux.acc_seg: 82.2471, loss: 0.6276, grad_norm: 4.2274
2023-02-19 05:10:50,850 - mmseg - INFO - Iter [14100/160000]	lr: 5.471e-05, eta: 11:33:25, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4338, decode.acc_seg: 83.4680, aux.loss_ce: 0.1860, aux.acc_seg: 82.5988, loss: 0.6198, grad_norm: 4.8808
2023-02-19 05:11:04,938 - mmseg - INFO - Iter [14150/160000]	lr: 5.469e-05, eta: 11:33:09, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3991, decode.acc_seg: 84.9080, aux.loss_ce: 0.1722, aux.acc_seg: 84.0540, loss: 0.5713, grad_norm: 4.2369
2023-02-19 05:11:18,571 - mmseg - INFO - Iter [14200/160000]	lr: 5.468e-05, eta: 11:32:49, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4511, decode.acc_seg: 83.3305, aux.loss_ce: 0.1956, aux.acc_seg: 82.0954, loss: 0.6467, grad_norm: 3.8727
2023-02-19 05:11:32,349 - mmseg - INFO - Iter [14250/160000]	lr: 5.466e-05, eta: 11:32:29, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4064, decode.acc_seg: 84.2997, aux.loss_ce: 0.1720, aux.acc_seg: 83.3940, loss: 0.5784, grad_norm: 3.6567
2023-02-19 05:11:46,237 - mmseg - INFO - Iter [14300/160000]	lr: 5.464e-05, eta: 11:32:11, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4016, decode.acc_seg: 84.6434, aux.loss_ce: 0.1730, aux.acc_seg: 83.8386, loss: 0.5745, grad_norm: 4.0407
2023-02-19 05:11:59,812 - mmseg - INFO - Iter [14350/160000]	lr: 5.462e-05, eta: 11:31:50, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4094, decode.acc_seg: 84.1652, aux.loss_ce: 0.1754, aux.acc_seg: 83.1639, loss: 0.5848, grad_norm: 4.4137
2023-02-19 05:12:13,622 - mmseg - INFO - Iter [14400/160000]	lr: 5.460e-05, eta: 11:31:32, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4323, decode.acc_seg: 83.2743, aux.loss_ce: 0.1855, aux.acc_seg: 82.4267, loss: 0.6178, grad_norm: 4.9567
2023-02-19 05:12:27,189 - mmseg - INFO - Iter [14450/160000]	lr: 5.458e-05, eta: 11:31:11, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4153, decode.acc_seg: 83.8739, aux.loss_ce: 0.1811, aux.acc_seg: 82.7636, loss: 0.5965, grad_norm: 4.3502
2023-02-19 05:12:42,285 - mmseg - INFO - Iter [14500/160000]	lr: 5.456e-05, eta: 11:31:05, time: 0.302, data_time: 0.006, memory: 15214, decode.loss_ce: 0.3974, decode.acc_seg: 84.2804, aux.loss_ce: 0.1706, aux.acc_seg: 83.5269, loss: 0.5680, grad_norm: 4.2094
2023-02-19 05:12:56,371 - mmseg - INFO - Iter [14550/160000]	lr: 5.454e-05, eta: 11:30:49, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4029, decode.acc_seg: 84.5229, aux.loss_ce: 0.1711, aux.acc_seg: 83.8845, loss: 0.5740, grad_norm: 3.9577
2023-02-19 05:13:10,286 - mmseg - INFO - Iter [14600/160000]	lr: 5.453e-05, eta: 11:30:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4523, decode.acc_seg: 82.9623, aux.loss_ce: 0.1905, aux.acc_seg: 82.3810, loss: 0.6427, grad_norm: 4.9800
2023-02-19 05:13:23,901 - mmseg - INFO - Iter [14650/160000]	lr: 5.451e-05, eta: 11:30:11, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4245, decode.acc_seg: 83.5010, aux.loss_ce: 0.1818, aux.acc_seg: 82.5019, loss: 0.6063, grad_norm: 4.8066
2023-02-19 05:13:37,597 - mmseg - INFO - Iter [14700/160000]	lr: 5.449e-05, eta: 11:29:51, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4161, decode.acc_seg: 84.3045, aux.loss_ce: 0.1821, aux.acc_seg: 83.1369, loss: 0.5983, grad_norm: 5.2636
2023-02-19 05:13:51,727 - mmseg - INFO - Iter [14750/160000]	lr: 5.447e-05, eta: 11:29:36, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4070, decode.acc_seg: 84.5237, aux.loss_ce: 0.1738, aux.acc_seg: 83.5847, loss: 0.5808, grad_norm: 5.2941
2023-02-19 05:14:06,136 - mmseg - INFO - Iter [14800/160000]	lr: 5.445e-05, eta: 11:29:23, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4199, decode.acc_seg: 84.1888, aux.loss_ce: 0.1790, aux.acc_seg: 83.3839, loss: 0.5989, grad_norm: 4.8454
2023-02-19 05:14:19,909 - mmseg - INFO - Iter [14850/160000]	lr: 5.443e-05, eta: 11:29:04, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4239, decode.acc_seg: 83.7357, aux.loss_ce: 0.1811, aux.acc_seg: 82.8805, loss: 0.6050, grad_norm: 5.9824
2023-02-19 05:14:33,658 - mmseg - INFO - Iter [14900/160000]	lr: 5.441e-05, eta: 11:28:45, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4187, decode.acc_seg: 83.8675, aux.loss_ce: 0.1759, aux.acc_seg: 83.5664, loss: 0.5946, grad_norm: 4.9050
2023-02-19 05:14:47,610 - mmseg - INFO - Iter [14950/160000]	lr: 5.439e-05, eta: 11:28:28, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4125, decode.acc_seg: 84.6411, aux.loss_ce: 0.1785, aux.acc_seg: 83.6557, loss: 0.5911, grad_norm: 4.0157
2023-02-19 05:15:01,505 - mmseg - INFO - Saving checkpoint at 15000 iterations
2023-02-19 05:15:04,734 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:15:04,734 - mmseg - INFO - Iter [15000/160000]	lr: 5.438e-05, eta: 11:28:42, time: 0.343, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4531, decode.acc_seg: 82.4007, aux.loss_ce: 0.1918, aux.acc_seg: 81.9344, loss: 0.6449, grad_norm: 4.5509
2023-02-19 05:15:18,467 - mmseg - INFO - Iter [15050/160000]	lr: 5.436e-05, eta: 11:28:23, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4042, decode.acc_seg: 83.8497, aux.loss_ce: 0.1716, aux.acc_seg: 83.1996, loss: 0.5757, grad_norm: 4.1705
2023-02-19 05:15:32,066 - mmseg - INFO - Iter [15100/160000]	lr: 5.434e-05, eta: 11:28:02, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4277, decode.acc_seg: 83.2030, aux.loss_ce: 0.1818, aux.acc_seg: 82.6561, loss: 0.6094, grad_norm: 3.9195
2023-02-19 05:15:46,694 - mmseg - INFO - Iter [15150/160000]	lr: 5.432e-05, eta: 11:27:52, time: 0.293, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4127, decode.acc_seg: 84.1454, aux.loss_ce: 0.1793, aux.acc_seg: 83.1939, loss: 0.5920, grad_norm: 3.9360
2023-02-19 05:16:02,552 - mmseg - INFO - Iter [15200/160000]	lr: 5.430e-05, eta: 11:27:53, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.3856, decode.acc_seg: 85.4488, aux.loss_ce: 0.1709, aux.acc_seg: 84.0171, loss: 0.5565, grad_norm: 3.8257
2023-02-19 05:16:16,362 - mmseg - INFO - Iter [15250/160000]	lr: 5.428e-05, eta: 11:27:34, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3878, decode.acc_seg: 84.8283, aux.loss_ce: 0.1684, aux.acc_seg: 83.9715, loss: 0.5562, grad_norm: 3.8185
2023-02-19 05:16:30,905 - mmseg - INFO - Iter [15300/160000]	lr: 5.426e-05, eta: 11:27:23, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4073, decode.acc_seg: 84.5377, aux.loss_ce: 0.1733, aux.acc_seg: 83.6747, loss: 0.5806, grad_norm: 3.9999
2023-02-19 05:16:45,257 - mmseg - INFO - Iter [15350/160000]	lr: 5.424e-05, eta: 11:27:09, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3948, decode.acc_seg: 84.6670, aux.loss_ce: 0.1695, aux.acc_seg: 83.6255, loss: 0.5643, grad_norm: 5.0772
2023-02-19 05:16:58,761 - mmseg - INFO - Iter [15400/160000]	lr: 5.423e-05, eta: 11:26:48, time: 0.270, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3996, decode.acc_seg: 85.2031, aux.loss_ce: 0.1704, aux.acc_seg: 84.4344, loss: 0.5700, grad_norm: 3.9172
2023-02-19 05:17:12,817 - mmseg - INFO - Iter [15450/160000]	lr: 5.421e-05, eta: 11:26:32, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3864, decode.acc_seg: 84.9131, aux.loss_ce: 0.1669, aux.acc_seg: 84.1177, loss: 0.5533, grad_norm: 4.9772
2023-02-19 05:17:27,027 - mmseg - INFO - Iter [15500/160000]	lr: 5.419e-05, eta: 11:26:17, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4148, decode.acc_seg: 84.0847, aux.loss_ce: 0.1805, aux.acc_seg: 82.6890, loss: 0.5953, grad_norm: 4.1850
2023-02-19 05:17:40,621 - mmseg - INFO - Iter [15550/160000]	lr: 5.417e-05, eta: 11:25:57, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4060, decode.acc_seg: 84.4261, aux.loss_ce: 0.1726, aux.acc_seg: 83.6473, loss: 0.5786, grad_norm: 5.3159
2023-02-19 05:17:55,090 - mmseg - INFO - Iter [15600/160000]	lr: 5.415e-05, eta: 11:25:45, time: 0.289, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3972, decode.acc_seg: 84.0595, aux.loss_ce: 0.1731, aux.acc_seg: 82.9975, loss: 0.5703, grad_norm: 4.4952
2023-02-19 05:18:08,852 - mmseg - INFO - Iter [15650/160000]	lr: 5.413e-05, eta: 11:25:26, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4134, decode.acc_seg: 84.4574, aux.loss_ce: 0.1785, aux.acc_seg: 83.3303, loss: 0.5919, grad_norm: 4.8583
2023-02-19 05:18:22,632 - mmseg - INFO - Iter [15700/160000]	lr: 5.411e-05, eta: 11:25:08, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3891, decode.acc_seg: 85.4599, aux.loss_ce: 0.1699, aux.acc_seg: 84.2679, loss: 0.5590, grad_norm: 4.2566
2023-02-19 05:18:36,479 - mmseg - INFO - Iter [15750/160000]	lr: 5.409e-05, eta: 11:24:50, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4167, decode.acc_seg: 83.8379, aux.loss_ce: 0.1773, aux.acc_seg: 83.1919, loss: 0.5941, grad_norm: 4.5161
2023-02-19 05:18:50,717 - mmseg - INFO - Iter [15800/160000]	lr: 5.408e-05, eta: 11:24:36, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4119, decode.acc_seg: 84.4272, aux.loss_ce: 0.1725, aux.acc_seg: 83.9402, loss: 0.5844, grad_norm: 4.4431
2023-02-19 05:19:04,693 - mmseg - INFO - Iter [15850/160000]	lr: 5.406e-05, eta: 11:24:19, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3935, decode.acc_seg: 84.7255, aux.loss_ce: 0.1722, aux.acc_seg: 83.6728, loss: 0.5657, grad_norm: 4.2167
2023-02-19 05:19:18,743 - mmseg - INFO - Iter [15900/160000]	lr: 5.404e-05, eta: 11:24:03, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4062, decode.acc_seg: 84.6172, aux.loss_ce: 0.1760, aux.acc_seg: 83.6033, loss: 0.5823, grad_norm: 3.9977
2023-02-19 05:19:32,702 - mmseg - INFO - Iter [15950/160000]	lr: 5.402e-05, eta: 11:23:46, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3944, decode.acc_seg: 84.9092, aux.loss_ce: 0.1703, aux.acc_seg: 83.9897, loss: 0.5647, grad_norm: 3.9923
2023-02-19 05:19:46,409 - mmseg - INFO - Saving checkpoint at 16000 iterations
2023-02-19 05:19:49,761 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:19:49,761 - mmseg - INFO - Iter [16000/160000]	lr: 5.400e-05, eta: 11:23:57, time: 0.341, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4003, decode.acc_seg: 84.7443, aux.loss_ce: 0.1741, aux.acc_seg: 83.6177, loss: 0.5744, grad_norm: 4.1926
2023-02-19 05:20:10,391 - mmseg - INFO - per class results:
2023-02-19 05:20:10,397 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 75.32 | 88.58 |
|       building      | 82.73 | 93.02 |
|         sky         | 94.28 | 96.93 |
|        floor        | 80.39 | 89.71 |
|         tree        | 74.42 | 85.83 |
|       ceiling       | 81.67 | 88.04 |
|         road        | 82.16 | 89.05 |
|         bed         | 88.68 | 96.42 |
|      windowpane     | 60.47 | 78.44 |
|        grass        | 65.42 | 81.54 |
|       cabinet       | 55.71 | 60.55 |
|       sidewalk      | 65.61 | 86.98 |
|        person       | 80.11 | 91.93 |
|        earth        | 29.57 | 35.84 |
|         door        | 43.09 | 50.26 |
|        table        |  55.8 | 71.25 |
|       mountain      | 59.03 | 77.05 |
|        plant        | 49.17 |  70.3 |
|       curtain       | 67.27 | 73.61 |
|        chair        | 56.74 | 71.65 |
|         car         | 80.51 | 95.35 |
|        water        | 56.69 | 73.35 |
|       painting      |  71.8 |  82.1 |
|         sofa        | 65.79 |  88.1 |
|        shelf        | 42.08 | 52.32 |
|        house        | 37.57 |  39.8 |
|         sea         | 58.94 | 89.14 |
|        mirror       | 53.92 | 56.74 |
|         rug         | 64.94 | 79.11 |
|        field        | 30.93 | 58.28 |
|       armchair      |  41.8 | 55.34 |
|         seat        | 61.33 | 79.55 |
|        fence        |  46.7 | 67.44 |
|         desk        | 45.76 | 78.61 |
|         rock        | 50.98 | 60.49 |
|       wardrobe      | 47.55 | 60.43 |
|         lamp        | 57.57 | 82.16 |
|       bathtub       |  75.8 | 79.73 |
|       railing       | 32.76 | 50.77 |
|       cushion       | 53.87 | 62.79 |
|         base        | 30.92 | 72.99 |
|         box         | 24.86 | 32.58 |
|        column       | 44.51 |  58.1 |
|      signboard      |  32.7 | 44.62 |
|   chest of drawers  | 39.49 |  73.2 |
|       counter       |  9.25 |  9.58 |
|         sand        | 43.99 | 60.68 |
|         sink        | 68.58 | 80.19 |
|      skyscraper     | 64.71 | 87.96 |
|      fireplace      | 65.12 | 96.58 |
|     refrigerator    | 66.84 | 90.94 |
|      grandstand     | 44.08 | 63.55 |
|         path        | 20.65 | 29.47 |
|        stairs       | 34.41 | 48.89 |
|        runway       | 68.47 | 92.35 |
|         case        | 63.49 | 82.48 |
|      pool table     | 91.52 |  96.5 |
|        pillow       | 53.13 | 60.94 |
|     screen door     | 50.97 | 59.13 |
|       stairway      | 40.14 | 49.37 |
|        river        | 12.76 | 17.78 |
|        bridge       | 61.11 | 63.57 |
|       bookcase      | 35.38 | 50.45 |
|        blind        | 45.09 | 53.13 |
|     coffee table    | 47.68 | 89.16 |
|        toilet       | 84.48 | 89.06 |
|        flower       | 34.87 | 75.73 |
|         book        | 39.08 | 52.02 |
|         hill        |  4.82 |  5.34 |
|        bench        |  48.7 | 54.07 |
|      countertop     | 42.63 | 44.66 |
|        stove        | 69.96 | 88.34 |
|         palm        |  52.9 | 69.52 |
|    kitchen island   | 39.33 | 70.91 |
|       computer      | 52.41 | 58.94 |
|     swivel chair    | 55.37 |  77.3 |
|         boat        | 35.12 | 35.59 |
|         bar         | 44.86 | 60.86 |
|    arcade machine   | 83.63 | 94.22 |
|        hovel        | 55.92 | 59.76 |
|         bus         | 75.84 | 97.92 |
|        towel        | 64.39 |  77.3 |
|        light        | 43.48 | 47.94 |
|        truck        | 26.79 | 56.17 |
|        tower        | 27.67 | 56.48 |
|      chandelier     | 61.71 | 67.87 |
|        awning       | 27.92 | 35.43 |
|     streetlight     | 21.19 | 30.75 |
|        booth        | 52.92 | 69.15 |
| television receiver | 68.56 | 77.76 |
|       airplane      | 41.56 | 68.87 |
|      dirt track     |  0.0  |  0.0  |
|       apparel       | 31.02 | 65.71 |
|         pole        | 10.98 |  13.7 |
|         land        |  6.09 | 24.43 |
|      bannister      |  13.2 | 15.78 |
|      escalator      | 29.99 | 37.79 |
|       ottoman       | 45.17 | 68.24 |
|        bottle       |  27.4 | 36.28 |
|        buffet       | 44.28 | 49.27 |
|        poster       | 29.57 | 42.89 |
|        stage        |  10.6 | 12.39 |
|         van         | 34.61 |  41.0 |
|         ship        | 30.53 |  46.6 |
|       fountain      | 24.74 |  25.7 |
|    conveyer belt    | 79.56 | 89.97 |
|        canopy       | 40.44 | 45.78 |
|        washer       | 76.85 | 83.01 |
|      plaything      |  30.0 | 46.12 |
|    swimming pool    | 48.84 | 77.06 |
|        stool        | 42.06 | 57.99 |
|        barrel       | 45.24 | 65.02 |
|        basket       | 29.53 | 54.19 |
|      waterfall      |  47.3 | 48.64 |
|         tent        | 92.41 |  97.7 |
|         bag         | 12.59 | 13.91 |
|       minibike      | 69.11 | 81.96 |
|        cradle       | 73.72 | 89.33 |
|         oven        |  6.62 |  6.62 |
|         ball        | 53.49 | 67.84 |
|         food        | 59.25 | 77.36 |
|         step        |  3.43 |  3.46 |
|         tank        | 57.39 | 66.89 |
|      trade name     |  6.78 |  6.96 |
|      microwave      | 82.79 | 95.13 |
|         pot         |  40.6 | 44.15 |
|        animal       | 60.79 | 64.17 |
|       bicycle       | 58.82 | 76.72 |
|         lake        |  0.04 |  0.04 |
|      dishwasher     | 59.83 | 73.85 |
|        screen       | 50.72 | 64.45 |
|       blanket       |  7.55 |  8.56 |
|      sculpture      | 65.46 | 82.01 |
|         hood        | 44.71 | 50.01 |
|        sconce       | 23.51 | 25.37 |
|         vase        | 28.37 | 37.15 |
|    traffic light    | 28.68 | 41.55 |
|         tray        |  1.85 |  2.06 |
|        ashcan       | 38.98 | 62.28 |
|         fan         | 57.69 |  80.2 |
|         pier        | 67.82 | 82.32 |
|      crt screen     |  5.57 | 42.36 |
|        plate        | 53.75 | 74.76 |
|       monitor       |  1.38 |  1.48 |
|    bulletin board   | 40.18 | 52.45 |
|        shower       |  0.0  |  0.0  |
|       radiator      | 61.94 | 73.52 |
|        glass        |  9.11 |  9.27 |
|        clock        | 29.47 | 35.67 |
|         flag        | 58.56 | 66.14 |
+---------------------+-------+-------+
2023-02-19 05:20:10,398 - mmseg - INFO - Summary:
2023-02-19 05:20:10,398 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 82.02 | 46.88 | 59.73 |
+-------+-------+-------+
2023-02-19 05:20:13,431 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_16000.pth.
2023-02-19 05:20:13,431 - mmseg - INFO - Best mIoU is 0.4688 at 16000 iter.
2023-02-19 05:20:13,431 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:20:13,431 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8202, mIoU: 0.4688, mAcc: 0.5973, IoU.wall: 0.7532, IoU.building: 0.8273, IoU.sky: 0.9428, IoU.floor: 0.8039, IoU.tree: 0.7442, IoU.ceiling: 0.8167, IoU.road: 0.8216, IoU.bed : 0.8868, IoU.windowpane: 0.6047, IoU.grass: 0.6542, IoU.cabinet: 0.5571, IoU.sidewalk: 0.6561, IoU.person: 0.8011, IoU.earth: 0.2957, IoU.door: 0.4309, IoU.table: 0.5580, IoU.mountain: 0.5903, IoU.plant: 0.4917, IoU.curtain: 0.6727, IoU.chair: 0.5674, IoU.car: 0.8051, IoU.water: 0.5669, IoU.painting: 0.7180, IoU.sofa: 0.6579, IoU.shelf: 0.4208, IoU.house: 0.3757, IoU.sea: 0.5894, IoU.mirror: 0.5392, IoU.rug: 0.6494, IoU.field: 0.3093, IoU.armchair: 0.4180, IoU.seat: 0.6133, IoU.fence: 0.4670, IoU.desk: 0.4576, IoU.rock: 0.5098, IoU.wardrobe: 0.4755, IoU.lamp: 0.5757, IoU.bathtub: 0.7580, IoU.railing: 0.3276, IoU.cushion: 0.5387, IoU.base: 0.3092, IoU.box: 0.2486, IoU.column: 0.4451, IoU.signboard: 0.3270, IoU.chest of drawers: 0.3949, IoU.counter: 0.0925, IoU.sand: 0.4399, IoU.sink: 0.6858, IoU.skyscraper: 0.6471, IoU.fireplace: 0.6512, IoU.refrigerator: 0.6684, IoU.grandstand: 0.4408, IoU.path: 0.2065, IoU.stairs: 0.3441, IoU.runway: 0.6847, IoU.case: 0.6349, IoU.pool table: 0.9152, IoU.pillow: 0.5313, IoU.screen door: 0.5097, IoU.stairway: 0.4014, IoU.river: 0.1276, IoU.bridge: 0.6111, IoU.bookcase: 0.3538, IoU.blind: 0.4509, IoU.coffee table: 0.4768, IoU.toilet: 0.8448, IoU.flower: 0.3487, IoU.book: 0.3908, IoU.hill: 0.0482, IoU.bench: 0.4870, IoU.countertop: 0.4263, IoU.stove: 0.6996, IoU.palm: 0.5290, IoU.kitchen island: 0.3933, IoU.computer: 0.5241, IoU.swivel chair: 0.5537, IoU.boat: 0.3512, IoU.bar: 0.4486, IoU.arcade machine: 0.8363, IoU.hovel: 0.5592, IoU.bus: 0.7584, IoU.towel: 0.6439, IoU.light: 0.4348, IoU.truck: 0.2679, IoU.tower: 0.2767, IoU.chandelier: 0.6171, IoU.awning: 0.2792, IoU.streetlight: 0.2119, IoU.booth: 0.5292, IoU.television receiver: 0.6856, IoU.airplane: 0.4156, IoU.dirt track: 0.0000, IoU.apparel: 0.3102, IoU.pole: 0.1098, IoU.land: 0.0609, IoU.bannister: 0.1320, IoU.escalator: 0.2999, IoU.ottoman: 0.4517, IoU.bottle: 0.2740, IoU.buffet: 0.4428, IoU.poster: 0.2957, IoU.stage: 0.1060, IoU.van: 0.3461, IoU.ship: 0.3053, IoU.fountain: 0.2474, IoU.conveyer belt: 0.7956, IoU.canopy: 0.4044, IoU.washer: 0.7685, IoU.plaything: 0.3000, IoU.swimming pool: 0.4884, IoU.stool: 0.4206, IoU.barrel: 0.4524, IoU.basket: 0.2953, IoU.waterfall: 0.4730, IoU.tent: 0.9241, IoU.bag: 0.1259, IoU.minibike: 0.6911, IoU.cradle: 0.7372, IoU.oven: 0.0662, IoU.ball: 0.5349, IoU.food: 0.5925, IoU.step: 0.0343, IoU.tank: 0.5739, IoU.trade name: 0.0678, IoU.microwave: 0.8279, IoU.pot: 0.4060, IoU.animal: 0.6079, IoU.bicycle: 0.5882, IoU.lake: 0.0004, IoU.dishwasher: 0.5983, IoU.screen: 0.5072, IoU.blanket: 0.0755, IoU.sculpture: 0.6546, IoU.hood: 0.4471, IoU.sconce: 0.2351, IoU.vase: 0.2837, IoU.traffic light: 0.2868, IoU.tray: 0.0185, IoU.ashcan: 0.3898, IoU.fan: 0.5769, IoU.pier: 0.6782, IoU.crt screen: 0.0557, IoU.plate: 0.5375, IoU.monitor: 0.0138, IoU.bulletin board: 0.4018, IoU.shower: 0.0000, IoU.radiator: 0.6194, IoU.glass: 0.0911, IoU.clock: 0.2947, IoU.flag: 0.5856, Acc.wall: 0.8858, Acc.building: 0.9302, Acc.sky: 0.9693, Acc.floor: 0.8971, Acc.tree: 0.8583, Acc.ceiling: 0.8804, Acc.road: 0.8905, Acc.bed : 0.9642, Acc.windowpane: 0.7844, Acc.grass: 0.8154, Acc.cabinet: 0.6055, Acc.sidewalk: 0.8698, Acc.person: 0.9193, Acc.earth: 0.3584, Acc.door: 0.5026, Acc.table: 0.7125, Acc.mountain: 0.7705, Acc.plant: 0.7030, Acc.curtain: 0.7361, 
Acc.chair: 0.7165, Acc.car: 0.9535, Acc.water: 0.7335, Acc.painting: 0.8210, Acc.sofa: 0.8810, Acc.shelf: 0.5232, Acc.house: 0.3980, Acc.sea: 0.8914, Acc.mirror: 0.5674, Acc.rug: 0.7911, Acc.field: 0.5828, Acc.armchair: 0.5534, Acc.seat: 0.7955, Acc.fence: 0.6744, Acc.desk: 0.7861, Acc.rock: 0.6049, Acc.wardrobe: 0.6043, Acc.lamp: 0.8216, Acc.bathtub: 0.7973, Acc.railing: 0.5077, Acc.cushion: 0.6279, Acc.base: 0.7299, Acc.box: 0.3258, Acc.column: 0.5810, Acc.signboard: 0.4462, Acc.chest of drawers: 0.7320, Acc.counter: 0.0958, Acc.sand: 0.6068, Acc.sink: 0.8019, Acc.skyscraper: 0.8796, Acc.fireplace: 0.9658, Acc.refrigerator: 0.9094, Acc.grandstand: 0.6355, Acc.path: 0.2947, Acc.stairs: 0.4889, Acc.runway: 0.9235, Acc.case: 0.8248, Acc.pool table: 0.9650, Acc.pillow: 0.6094, Acc.screen door: 0.5913, Acc.stairway: 0.4937, Acc.river: 0.1778, Acc.bridge: 0.6357, Acc.bookcase: 0.5045, Acc.blind: 0.5313, Acc.coffee table: 0.8916, Acc.toilet: 0.8906, Acc.flower: 0.7573, Acc.book: 0.5202, Acc.hill: 0.0534, Acc.bench: 0.5407, Acc.countertop: 0.4466, Acc.stove: 0.8834, Acc.palm: 0.6952, Acc.kitchen island: 0.7091, Acc.computer: 0.5894, Acc.swivel chair: 0.7730, Acc.boat: 0.3559, Acc.bar: 0.6086, Acc.arcade machine: 0.9422, Acc.hovel: 0.5976, Acc.bus: 0.9792, Acc.towel: 0.7730, Acc.light: 0.4794, Acc.truck: 0.5617, Acc.tower: 0.5648, Acc.chandelier: 0.6787, Acc.awning: 0.3543, Acc.streetlight: 0.3075, Acc.booth: 0.6915, Acc.television receiver: 0.7776, Acc.airplane: 0.6887, Acc.dirt track: 0.0000, Acc.apparel: 0.6571, Acc.pole: 0.1370, Acc.land: 0.2443, Acc.bannister: 0.1578, Acc.escalator: 0.3779, Acc.ottoman: 0.6824, Acc.bottle: 0.3628, Acc.buffet: 0.4927, Acc.poster: 0.4289, Acc.stage: 0.1239, Acc.van: 0.4100, Acc.ship: 0.4660, Acc.fountain: 0.2570, Acc.conveyer belt: 0.8997, Acc.canopy: 0.4578, Acc.washer: 0.8301, Acc.plaything: 0.4612, Acc.swimming pool: 0.7706, Acc.stool: 0.5799, Acc.barrel: 0.6502, Acc.basket: 0.5419, Acc.waterfall: 0.4864, Acc.tent: 0.9770, Acc.bag: 0.1391, Acc.minibike: 0.8196, Acc.cradle: 0.8933, Acc.oven: 0.0662, Acc.ball: 0.6784, Acc.food: 0.7736, Acc.step: 0.0346, Acc.tank: 0.6689, Acc.trade name: 0.0696, Acc.microwave: 0.9513, Acc.pot: 0.4415, Acc.animal: 0.6417, Acc.bicycle: 0.7672, Acc.lake: 0.0004, Acc.dishwasher: 0.7385, Acc.screen: 0.6445, Acc.blanket: 0.0856, Acc.sculpture: 0.8201, Acc.hood: 0.5001, Acc.sconce: 0.2537, Acc.vase: 0.3715, Acc.traffic light: 0.4155, Acc.tray: 0.0206, Acc.ashcan: 0.6228, Acc.fan: 0.8020, Acc.pier: 0.8232, Acc.crt screen: 0.4236, Acc.plate: 0.7476, Acc.monitor: 0.0148, Acc.bulletin board: 0.5245, Acc.shower: 0.0000, Acc.radiator: 0.7352, Acc.glass: 0.0927, Acc.clock: 0.3567, Acc.flag: 0.6614
2023-02-19 05:20:27,459 - mmseg - INFO - Iter [16050/160000]	lr: 5.398e-05, eta: 11:27:13, time: 0.754, data_time: 0.478, memory: 15214, decode.loss_ce: 0.4182, decode.acc_seg: 83.7516, aux.loss_ce: 0.1761, aux.acc_seg: 83.1240, loss: 0.5943, grad_norm: 5.5774
2023-02-19 05:20:41,459 - mmseg - INFO - Iter [16100/160000]	lr: 5.396e-05, eta: 11:26:56, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4313, decode.acc_seg: 83.5738, aux.loss_ce: 0.1850, aux.acc_seg: 82.5919, loss: 0.6163, grad_norm: 4.4501
2023-02-19 05:20:55,898 - mmseg - INFO - Iter [16150/160000]	lr: 5.394e-05, eta: 11:26:43, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4125, decode.acc_seg: 84.5450, aux.loss_ce: 0.1806, aux.acc_seg: 83.5641, loss: 0.5931, grad_norm: 4.1576
2023-02-19 05:21:09,767 - mmseg - INFO - Iter [16200/160000]	lr: 5.393e-05, eta: 11:26:25, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4031, decode.acc_seg: 84.2367, aux.loss_ce: 0.1765, aux.acc_seg: 83.3386, loss: 0.5797, grad_norm: 4.3826
2023-02-19 05:21:24,684 - mmseg - INFO - Iter [16250/160000]	lr: 5.391e-05, eta: 11:26:16, time: 0.298, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3932, decode.acc_seg: 84.9196, aux.loss_ce: 0.1699, aux.acc_seg: 83.9653, loss: 0.5631, grad_norm: 5.4165
2023-02-19 05:21:38,514 - mmseg - INFO - Iter [16300/160000]	lr: 5.389e-05, eta: 11:25:57, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4176, decode.acc_seg: 84.3127, aux.loss_ce: 0.1780, aux.acc_seg: 83.3788, loss: 0.5956, grad_norm: 4.9371
2023-02-19 05:21:52,628 - mmseg - INFO - Iter [16350/160000]	lr: 5.387e-05, eta: 11:25:41, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4107, decode.acc_seg: 83.9702, aux.loss_ce: 0.1748, aux.acc_seg: 83.3964, loss: 0.5854, grad_norm: 4.6784
2023-02-19 05:22:06,527 - mmseg - INFO - Iter [16400/160000]	lr: 5.385e-05, eta: 11:25:23, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3893, decode.acc_seg: 84.8782, aux.loss_ce: 0.1691, aux.acc_seg: 83.7736, loss: 0.5584, grad_norm: 4.7088
2023-02-19 05:22:22,386 - mmseg - INFO - Iter [16450/160000]	lr: 5.383e-05, eta: 11:25:22, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.3905, decode.acc_seg: 84.0819, aux.loss_ce: 0.1703, aux.acc_seg: 83.2939, loss: 0.5609, grad_norm: 5.1346
2023-02-19 05:22:36,337 - mmseg - INFO - Iter [16500/160000]	lr: 5.381e-05, eta: 11:25:04, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3744, decode.acc_seg: 85.9239, aux.loss_ce: 0.1640, aux.acc_seg: 84.6088, loss: 0.5384, grad_norm: 3.6760
2023-02-19 05:22:50,435 - mmseg - INFO - Iter [16550/160000]	lr: 5.379e-05, eta: 11:24:48, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3827, decode.acc_seg: 85.4144, aux.loss_ce: 0.1662, aux.acc_seg: 84.4811, loss: 0.5489, grad_norm: 3.9124
2023-02-19 05:23:04,322 - mmseg - INFO - Iter [16600/160000]	lr: 5.378e-05, eta: 11:24:30, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3678, decode.acc_seg: 85.9309, aux.loss_ce: 0.1612, aux.acc_seg: 84.6310, loss: 0.5291, grad_norm: 4.2727
2023-02-19 05:23:18,504 - mmseg - INFO - Iter [16650/160000]	lr: 5.376e-05, eta: 11:24:14, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3922, decode.acc_seg: 84.2372, aux.loss_ce: 0.1722, aux.acc_seg: 83.2629, loss: 0.5645, grad_norm: 5.1002
2023-02-19 05:23:32,770 - mmseg - INFO - Iter [16700/160000]	lr: 5.374e-05, eta: 11:24:00, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3965, decode.acc_seg: 85.3362, aux.loss_ce: 0.1729, aux.acc_seg: 84.3059, loss: 0.5694, grad_norm: 4.6239
2023-02-19 05:23:46,689 - mmseg - INFO - Iter [16750/160000]	lr: 5.372e-05, eta: 11:23:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3936, decode.acc_seg: 85.1094, aux.loss_ce: 0.1734, aux.acc_seg: 84.0707, loss: 0.5670, grad_norm: 5.3506
2023-02-19 05:24:00,601 - mmseg - INFO - Iter [16800/160000]	lr: 5.370e-05, eta: 11:23:24, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4076, decode.acc_seg: 84.4078, aux.loss_ce: 0.1781, aux.acc_seg: 83.5128, loss: 0.5857, grad_norm: 5.0434
2023-02-19 05:24:14,654 - mmseg - INFO - Iter [16850/160000]	lr: 5.368e-05, eta: 11:23:07, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4073, decode.acc_seg: 84.5957, aux.loss_ce: 0.1735, aux.acc_seg: 83.8632, loss: 0.5808, grad_norm: 5.1095
2023-02-19 05:24:28,500 - mmseg - INFO - Iter [16900/160000]	lr: 5.366e-05, eta: 11:22:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3821, decode.acc_seg: 85.6327, aux.loss_ce: 0.1690, aux.acc_seg: 84.4764, loss: 0.5511, grad_norm: 4.6097
2023-02-19 05:24:42,269 - mmseg - INFO - Iter [16950/160000]	lr: 5.364e-05, eta: 11:22:30, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3864, decode.acc_seg: 84.9986, aux.loss_ce: 0.1716, aux.acc_seg: 83.4790, loss: 0.5580, grad_norm: 4.7203
2023-02-19 05:24:56,114 - mmseg - INFO - Saving checkpoint at 17000 iterations
2023-02-19 05:24:59,585 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:24:59,585 - mmseg - INFO - Iter [17000/160000]	lr: 5.363e-05, eta: 11:22:41, time: 0.346, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3824, decode.acc_seg: 85.3449, aux.loss_ce: 0.1648, aux.acc_seg: 84.3908, loss: 0.5473, grad_norm: 4.1165
2023-02-19 05:25:13,252 - mmseg - INFO - Iter [17050/160000]	lr: 5.361e-05, eta: 11:22:21, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4092, decode.acc_seg: 84.4878, aux.loss_ce: 0.1840, aux.acc_seg: 82.8825, loss: 0.5932, grad_norm: 4.4046
2023-02-19 05:25:27,167 - mmseg - INFO - Iter [17100/160000]	lr: 5.359e-05, eta: 11:22:03, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3873, decode.acc_seg: 85.0862, aux.loss_ce: 0.1679, aux.acc_seg: 84.1221, loss: 0.5552, grad_norm: 4.4035
2023-02-19 05:25:41,080 - mmseg - INFO - Iter [17150/160000]	lr: 5.357e-05, eta: 11:21:46, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3917, decode.acc_seg: 85.0392, aux.loss_ce: 0.1704, aux.acc_seg: 84.1110, loss: 0.5621, grad_norm: 4.2512
2023-02-19 05:25:54,951 - mmseg - INFO - Iter [17200/160000]	lr: 5.355e-05, eta: 11:21:28, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4130, decode.acc_seg: 84.6628, aux.loss_ce: 0.1823, aux.acc_seg: 83.5230, loss: 0.5952, grad_norm: 5.2696
2023-02-19 05:26:08,691 - mmseg - INFO - Iter [17250/160000]	lr: 5.353e-05, eta: 11:21:09, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3850, decode.acc_seg: 84.9788, aux.loss_ce: 0.1721, aux.acc_seg: 83.5629, loss: 0.5571, grad_norm: 4.3153
2023-02-19 05:26:22,722 - mmseg - INFO - Iter [17300/160000]	lr: 5.351e-05, eta: 11:20:52, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3875, decode.acc_seg: 84.9324, aux.loss_ce: 0.1708, aux.acc_seg: 83.7572, loss: 0.5584, grad_norm: 5.3424
2023-02-19 05:26:36,927 - mmseg - INFO - Iter [17350/160000]	lr: 5.349e-05, eta: 11:20:37, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3686, decode.acc_seg: 86.2025, aux.loss_ce: 0.1622, aux.acc_seg: 84.9916, loss: 0.5308, grad_norm: 3.6605
2023-02-19 05:26:50,614 - mmseg - INFO - Iter [17400/160000]	lr: 5.348e-05, eta: 11:20:17, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3719, decode.acc_seg: 85.3617, aux.loss_ce: 0.1605, aux.acc_seg: 84.3451, loss: 0.5324, grad_norm: 3.6930
2023-02-19 05:27:04,304 - mmseg - INFO - Iter [17450/160000]	lr: 5.346e-05, eta: 11:19:58, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3748, decode.acc_seg: 85.6049, aux.loss_ce: 0.1671, aux.acc_seg: 84.3041, loss: 0.5419, grad_norm: 4.1057
2023-02-19 05:27:18,137 - mmseg - INFO - Iter [17500/160000]	lr: 5.344e-05, eta: 11:19:40, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.4135, decode.acc_seg: 84.0528, aux.loss_ce: 0.1813, aux.acc_seg: 82.5421, loss: 0.5948, grad_norm: 4.6204
2023-02-19 05:27:31,812 - mmseg - INFO - Iter [17550/160000]	lr: 5.342e-05, eta: 11:19:20, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3877, decode.acc_seg: 85.1393, aux.loss_ce: 0.1688, aux.acc_seg: 84.0199, loss: 0.5565, grad_norm: 4.7121
2023-02-19 05:27:45,386 - mmseg - INFO - Iter [17600/160000]	lr: 5.340e-05, eta: 11:19:00, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3729, decode.acc_seg: 85.5301, aux.loss_ce: 0.1653, aux.acc_seg: 84.3477, loss: 0.5383, grad_norm: 4.7762
2023-02-19 05:27:59,612 - mmseg - INFO - Iter [17650/160000]	lr: 5.338e-05, eta: 11:18:45, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3935, decode.acc_seg: 84.3471, aux.loss_ce: 0.1714, aux.acc_seg: 83.0921, loss: 0.5649, grad_norm: 4.8286
2023-02-19 05:28:15,968 - mmseg - INFO - Iter [17700/160000]	lr: 5.336e-05, eta: 11:18:47, time: 0.326, data_time: 0.048, memory: 15214, decode.loss_ce: 0.3742, decode.acc_seg: 85.0616, aux.loss_ce: 0.1681, aux.acc_seg: 83.8665, loss: 0.5424, grad_norm: 4.4073
2023-02-19 05:28:30,224 - mmseg - INFO - Iter [17750/160000]	lr: 5.334e-05, eta: 11:18:33, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3528, decode.acc_seg: 86.3215, aux.loss_ce: 0.1582, aux.acc_seg: 85.0727, loss: 0.5110, grad_norm: 5.0382
2023-02-19 05:28:44,046 - mmseg - INFO - Iter [17800/160000]	lr: 5.333e-05, eta: 11:18:14, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3871, decode.acc_seg: 84.5994, aux.loss_ce: 0.1695, aux.acc_seg: 83.8803, loss: 0.5566, grad_norm: 4.1564
2023-02-19 05:28:57,879 - mmseg - INFO - Iter [17850/160000]	lr: 5.331e-05, eta: 11:17:56, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3812, decode.acc_seg: 85.5721, aux.loss_ce: 0.1674, aux.acc_seg: 84.3529, loss: 0.5487, grad_norm: 4.3657
2023-02-19 05:29:11,529 - mmseg - INFO - Iter [17900/160000]	lr: 5.329e-05, eta: 11:17:37, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3481, decode.acc_seg: 86.2874, aux.loss_ce: 0.1550, aux.acc_seg: 84.9825, loss: 0.5032, grad_norm: 4.3947
2023-02-19 05:29:25,286 - mmseg - INFO - Iter [17950/160000]	lr: 5.327e-05, eta: 11:17:18, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3702, decode.acc_seg: 85.6321, aux.loss_ce: 0.1683, aux.acc_seg: 83.8866, loss: 0.5385, grad_norm: 4.3307
2023-02-19 05:29:39,119 - mmseg - INFO - Saving checkpoint at 18000 iterations
2023-02-19 05:29:42,361 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:29:42,361 - mmseg - INFO - Iter [18000/160000]	lr: 5.325e-05, eta: 11:17:26, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3646, decode.acc_seg: 86.0554, aux.loss_ce: 0.1640, aux.acc_seg: 84.5667, loss: 0.5286, grad_norm: 3.9847
2023-02-19 05:29:56,376 - mmseg - INFO - Iter [18050/160000]	lr: 5.323e-05, eta: 11:17:09, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3830, decode.acc_seg: 85.7118, aux.loss_ce: 0.1690, aux.acc_seg: 84.3537, loss: 0.5521, grad_norm: 3.8511
2023-02-19 05:30:10,274 - mmseg - INFO - Iter [18100/160000]	lr: 5.321e-05, eta: 11:16:52, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3682, decode.acc_seg: 85.9570, aux.loss_ce: 0.1632, aux.acc_seg: 84.8096, loss: 0.5314, grad_norm: 3.9077
2023-02-19 05:30:23,883 - mmseg - INFO - Iter [18150/160000]	lr: 5.319e-05, eta: 11:16:32, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.4048, decode.acc_seg: 84.4761, aux.loss_ce: 0.1785, aux.acc_seg: 83.1669, loss: 0.5833, grad_norm: 6.9818
2023-02-19 05:30:37,864 - mmseg - INFO - Iter [18200/160000]	lr: 5.318e-05, eta: 11:16:15, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3687, decode.acc_seg: 85.7844, aux.loss_ce: 0.1601, aux.acc_seg: 84.6417, loss: 0.5288, grad_norm: 3.7427
2023-02-19 05:30:52,180 - mmseg - INFO - Iter [18250/160000]	lr: 5.316e-05, eta: 11:16:00, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3470, decode.acc_seg: 86.2802, aux.loss_ce: 0.1556, aux.acc_seg: 84.9658, loss: 0.5026, grad_norm: 3.7763
2023-02-19 05:31:05,856 - mmseg - INFO - Iter [18300/160000]	lr: 5.314e-05, eta: 11:15:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3741, decode.acc_seg: 85.3977, aux.loss_ce: 0.1637, aux.acc_seg: 84.1638, loss: 0.5377, grad_norm: 4.7529
2023-02-19 05:31:20,004 - mmseg - INFO - Iter [18350/160000]	lr: 5.312e-05, eta: 11:15:26, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3609, decode.acc_seg: 85.7920, aux.loss_ce: 0.1628, aux.acc_seg: 84.3129, loss: 0.5237, grad_norm: 4.2461
2023-02-19 05:31:33,925 - mmseg - INFO - Iter [18400/160000]	lr: 5.310e-05, eta: 11:15:09, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3477, decode.acc_seg: 86.2838, aux.loss_ce: 0.1537, aux.acc_seg: 84.8506, loss: 0.5015, grad_norm: 3.8673
2023-02-19 05:31:47,984 - mmseg - INFO - Iter [18450/160000]	lr: 5.308e-05, eta: 11:14:52, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3616, decode.acc_seg: 86.1949, aux.loss_ce: 0.1618, aux.acc_seg: 84.8283, loss: 0.5234, grad_norm: 4.3892
2023-02-19 05:32:02,375 - mmseg - INFO - Iter [18500/160000]	lr: 5.306e-05, eta: 11:14:39, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3528, decode.acc_seg: 86.2769, aux.loss_ce: 0.1570, aux.acc_seg: 84.7554, loss: 0.5099, grad_norm: 4.2773
2023-02-19 05:32:16,347 - mmseg - INFO - Iter [18550/160000]	lr: 5.304e-05, eta: 11:14:22, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3695, decode.acc_seg: 86.0565, aux.loss_ce: 0.1695, aux.acc_seg: 84.3461, loss: 0.5390, grad_norm: 4.2754
2023-02-19 05:32:30,409 - mmseg - INFO - Iter [18600/160000]	lr: 5.303e-05, eta: 11:14:06, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3915, decode.acc_seg: 84.6894, aux.loss_ce: 0.1739, aux.acc_seg: 83.3134, loss: 0.5654, grad_norm: 4.8469
2023-02-19 05:32:43,947 - mmseg - INFO - Iter [18650/160000]	lr: 5.301e-05, eta: 11:13:46, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3530, decode.acc_seg: 86.3640, aux.loss_ce: 0.1619, aux.acc_seg: 84.5694, loss: 0.5149, grad_norm: 4.6627
2023-02-19 05:32:57,998 - mmseg - INFO - Iter [18700/160000]	lr: 5.299e-05, eta: 11:13:29, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3449, decode.acc_seg: 86.7981, aux.loss_ce: 0.1523, aux.acc_seg: 85.6186, loss: 0.4972, grad_norm: 3.4146
2023-02-19 05:33:11,942 - mmseg - INFO - Iter [18750/160000]	lr: 5.297e-05, eta: 11:13:13, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3935, decode.acc_seg: 85.1959, aux.loss_ce: 0.1702, aux.acc_seg: 83.9066, loss: 0.5637, grad_norm: 5.3216
2023-02-19 05:33:25,510 - mmseg - INFO - Iter [18800/160000]	lr: 5.295e-05, eta: 11:12:53, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3653, decode.acc_seg: 85.5405, aux.loss_ce: 0.1665, aux.acc_seg: 83.9496, loss: 0.5318, grad_norm: 3.9192
2023-02-19 05:33:39,102 - mmseg - INFO - Iter [18850/160000]	lr: 5.293e-05, eta: 11:12:33, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3636, decode.acc_seg: 86.1767, aux.loss_ce: 0.1626, aux.acc_seg: 84.4155, loss: 0.5262, grad_norm: 3.7961
2023-02-19 05:33:53,109 - mmseg - INFO - Iter [18900/160000]	lr: 5.291e-05, eta: 11:12:17, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3836, decode.acc_seg: 85.1789, aux.loss_ce: 0.1710, aux.acc_seg: 83.3376, loss: 0.5545, grad_norm: 3.8725
2023-02-19 05:34:09,193 - mmseg - INFO - Iter [18950/160000]	lr: 5.289e-05, eta: 11:12:16, time: 0.322, data_time: 0.048, memory: 15214, decode.loss_ce: 0.3649, decode.acc_seg: 85.7218, aux.loss_ce: 0.1603, aux.acc_seg: 84.4121, loss: 0.5252, grad_norm: 4.0353
2023-02-19 05:34:23,008 - mmseg - INFO - Saving checkpoint at 19000 iterations
2023-02-19 05:34:26,256 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:34:26,256 - mmseg - INFO - Iter [19000/160000]	lr: 5.288e-05, eta: 11:12:22, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3690, decode.acc_seg: 86.1577, aux.loss_ce: 0.1673, aux.acc_seg: 84.4122, loss: 0.5364, grad_norm: 4.8982
2023-02-19 05:34:40,369 - mmseg - INFO - Iter [19050/160000]	lr: 5.286e-05, eta: 11:12:06, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3305, decode.acc_seg: 87.5443, aux.loss_ce: 0.1486, aux.acc_seg: 85.9806, loss: 0.4791, grad_norm: 3.5995
2023-02-19 05:34:54,020 - mmseg - INFO - Iter [19100/160000]	lr: 5.284e-05, eta: 11:11:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3438, decode.acc_seg: 86.7511, aux.loss_ce: 0.1562, aux.acc_seg: 85.4613, loss: 0.5000, grad_norm: 4.0181
2023-02-19 05:35:08,595 - mmseg - INFO - Iter [19150/160000]	lr: 5.282e-05, eta: 11:11:35, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3687, decode.acc_seg: 85.9921, aux.loss_ce: 0.1649, aux.acc_seg: 84.3420, loss: 0.5336, grad_norm: 4.1262
2023-02-19 05:35:22,174 - mmseg - INFO - Iter [19200/160000]	lr: 5.280e-05, eta: 11:11:15, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3567, decode.acc_seg: 86.5963, aux.loss_ce: 0.1594, aux.acc_seg: 85.0855, loss: 0.5161, grad_norm: 4.0485
2023-02-19 05:35:35,778 - mmseg - INFO - Iter [19250/160000]	lr: 5.278e-05, eta: 11:10:56, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3562, decode.acc_seg: 86.3390, aux.loss_ce: 0.1609, aux.acc_seg: 84.8193, loss: 0.5171, grad_norm: 4.8340
2023-02-19 05:35:49,951 - mmseg - INFO - Iter [19300/160000]	lr: 5.276e-05, eta: 11:10:41, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3449, decode.acc_seg: 86.7736, aux.loss_ce: 0.1579, aux.acc_seg: 84.9283, loss: 0.5028, grad_norm: 3.7338
2023-02-19 05:36:03,895 - mmseg - INFO - Iter [19350/160000]	lr: 5.274e-05, eta: 11:10:24, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3458, decode.acc_seg: 86.4754, aux.loss_ce: 0.1590, aux.acc_seg: 84.7799, loss: 0.5049, grad_norm: 3.4200
2023-02-19 05:36:18,128 - mmseg - INFO - Iter [19400/160000]	lr: 5.273e-05, eta: 11:10:09, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3730, decode.acc_seg: 85.6584, aux.loss_ce: 0.1693, aux.acc_seg: 83.9545, loss: 0.5423, grad_norm: 4.6544
2023-02-19 05:36:31,779 - mmseg - INFO - Iter [19450/160000]	lr: 5.271e-05, eta: 11:09:50, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3372, decode.acc_seg: 86.9137, aux.loss_ce: 0.1539, aux.acc_seg: 85.4482, loss: 0.4911, grad_norm: 3.7515
2023-02-19 05:36:45,413 - mmseg - INFO - Iter [19500/160000]	lr: 5.269e-05, eta: 11:09:31, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3589, decode.acc_seg: 86.5032, aux.loss_ce: 0.1625, aux.acc_seg: 84.8005, loss: 0.5214, grad_norm: 4.7385
2023-02-19 05:36:59,415 - mmseg - INFO - Iter [19550/160000]	lr: 5.267e-05, eta: 11:09:14, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3493, decode.acc_seg: 86.3534, aux.loss_ce: 0.1607, aux.acc_seg: 84.6246, loss: 0.5101, grad_norm: 5.4094
2023-02-19 05:37:13,202 - mmseg - INFO - Iter [19600/160000]	lr: 5.265e-05, eta: 11:08:56, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3680, decode.acc_seg: 85.8004, aux.loss_ce: 0.1619, aux.acc_seg: 84.6744, loss: 0.5299, grad_norm: 3.8993
2023-02-19 05:37:27,403 - mmseg - INFO - Iter [19650/160000]	lr: 5.263e-05, eta: 11:08:41, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3418, decode.acc_seg: 86.7071, aux.loss_ce: 0.1590, aux.acc_seg: 84.9434, loss: 0.5008, grad_norm: 4.8896
2023-02-19 05:37:42,183 - mmseg - INFO - Iter [19700/160000]	lr: 5.261e-05, eta: 11:08:31, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.3488, decode.acc_seg: 86.6397, aux.loss_ce: 0.1569, aux.acc_seg: 85.0378, loss: 0.5057, grad_norm: 3.7701
2023-02-19 05:37:56,046 - mmseg - INFO - Iter [19750/160000]	lr: 5.259e-05, eta: 11:08:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3379, decode.acc_seg: 86.6622, aux.loss_ce: 0.1565, aux.acc_seg: 84.7396, loss: 0.4944, grad_norm: 3.5617
2023-02-19 05:38:09,932 - mmseg - INFO - Iter [19800/160000]	lr: 5.258e-05, eta: 11:07:56, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3502, decode.acc_seg: 86.4317, aux.loss_ce: 0.1595, aux.acc_seg: 84.8474, loss: 0.5096, grad_norm: 4.7329
2023-02-19 05:38:24,126 - mmseg - INFO - Iter [19850/160000]	lr: 5.256e-05, eta: 11:07:41, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3661, decode.acc_seg: 86.2270, aux.loss_ce: 0.1652, aux.acc_seg: 84.5238, loss: 0.5313, grad_norm: 4.2221
2023-02-19 05:38:37,869 - mmseg - INFO - Iter [19900/160000]	lr: 5.254e-05, eta: 11:07:23, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3577, decode.acc_seg: 85.8945, aux.loss_ce: 0.1659, aux.acc_seg: 84.0410, loss: 0.5236, grad_norm: 4.4948
2023-02-19 05:38:52,100 - mmseg - INFO - Iter [19950/160000]	lr: 5.252e-05, eta: 11:07:08, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3423, decode.acc_seg: 86.9810, aux.loss_ce: 0.1612, aux.acc_seg: 84.8663, loss: 0.5035, grad_norm: 3.6721
2023-02-19 05:39:06,016 - mmseg - INFO - Saving checkpoint at 20000 iterations
2023-02-19 05:39:09,258 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:39:09,259 - mmseg - INFO - Iter [20000/160000]	lr: 5.250e-05, eta: 11:07:14, time: 0.343, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3322, decode.acc_seg: 87.0258, aux.loss_ce: 0.1542, aux.acc_seg: 85.1213, loss: 0.4864, grad_norm: 4.0671
2023-02-19 05:39:22,858 - mmseg - INFO - Iter [20050/160000]	lr: 5.248e-05, eta: 11:06:55, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3459, decode.acc_seg: 86.5374, aux.loss_ce: 0.1576, aux.acc_seg: 84.7928, loss: 0.5036, grad_norm: 3.8842
2023-02-19 05:39:37,000 - mmseg - INFO - Iter [20100/160000]	lr: 5.246e-05, eta: 11:06:40, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3612, decode.acc_seg: 86.0150, aux.loss_ce: 0.1654, aux.acc_seg: 84.2668, loss: 0.5266, grad_norm: 3.9351
2023-02-19 05:39:50,795 - mmseg - INFO - Iter [20150/160000]	lr: 5.244e-05, eta: 11:06:22, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3451, decode.acc_seg: 86.8191, aux.loss_ce: 0.1564, aux.acc_seg: 85.1269, loss: 0.5015, grad_norm: 3.6163
2023-02-19 05:40:06,033 - mmseg - INFO - Iter [20200/160000]	lr: 5.243e-05, eta: 11:06:14, time: 0.305, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3648, decode.acc_seg: 86.2033, aux.loss_ce: 0.1667, aux.acc_seg: 84.5989, loss: 0.5315, grad_norm: 3.9422
2023-02-19 05:40:21,893 - mmseg - INFO - Iter [20250/160000]	lr: 5.241e-05, eta: 11:06:10, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.3489, decode.acc_seg: 86.6558, aux.loss_ce: 0.1604, aux.acc_seg: 84.9357, loss: 0.5094, grad_norm: 4.9558
2023-02-19 05:40:35,613 - mmseg - INFO - Iter [20300/160000]	lr: 5.239e-05, eta: 11:05:52, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3477, decode.acc_seg: 86.3466, aux.loss_ce: 0.1609, aux.acc_seg: 84.6601, loss: 0.5086, grad_norm: 4.7178
2023-02-19 05:40:49,175 - mmseg - INFO - Iter [20350/160000]	lr: 5.237e-05, eta: 11:05:33, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3159, decode.acc_seg: 87.7124, aux.loss_ce: 0.1468, aux.acc_seg: 85.8811, loss: 0.4627, grad_norm: 3.5771
2023-02-19 05:41:02,960 - mmseg - INFO - Iter [20400/160000]	lr: 5.235e-05, eta: 11:05:15, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3605, decode.acc_seg: 86.2723, aux.loss_ce: 0.1601, aux.acc_seg: 84.7973, loss: 0.5206, grad_norm: 4.3274
2023-02-19 05:41:17,006 - mmseg - INFO - Iter [20450/160000]	lr: 5.233e-05, eta: 11:04:59, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3249, decode.acc_seg: 86.8716, aux.loss_ce: 0.1458, aux.acc_seg: 85.4284, loss: 0.4707, grad_norm: 3.3001
2023-02-19 05:41:30,798 - mmseg - INFO - Iter [20500/160000]	lr: 5.231e-05, eta: 11:04:41, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3554, decode.acc_seg: 86.0430, aux.loss_ce: 0.1627, aux.acc_seg: 84.1207, loss: 0.5181, grad_norm: 4.5884
2023-02-19 05:41:44,448 - mmseg - INFO - Iter [20550/160000]	lr: 5.229e-05, eta: 11:04:23, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3376, decode.acc_seg: 87.0077, aux.loss_ce: 0.1531, aux.acc_seg: 85.6466, loss: 0.4907, grad_norm: 3.7909
2023-02-19 05:41:59,847 - mmseg - INFO - Iter [20600/160000]	lr: 5.228e-05, eta: 11:04:16, time: 0.307, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3537, decode.acc_seg: 86.4712, aux.loss_ce: 0.1646, aux.acc_seg: 84.6959, loss: 0.5183, grad_norm: 4.7331
2023-02-19 05:42:13,830 - mmseg - INFO - Iter [20650/160000]	lr: 5.226e-05, eta: 11:03:59, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3401, decode.acc_seg: 86.9433, aux.loss_ce: 0.1574, aux.acc_seg: 84.9981, loss: 0.4975, grad_norm: 4.0483
2023-02-19 05:42:27,981 - mmseg - INFO - Iter [20700/160000]	lr: 5.224e-05, eta: 11:03:44, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3334, decode.acc_seg: 86.9581, aux.loss_ce: 0.1499, aux.acc_seg: 85.6918, loss: 0.4833, grad_norm: 3.9678
2023-02-19 05:42:41,720 - mmseg - INFO - Iter [20750/160000]	lr: 5.222e-05, eta: 11:03:26, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3392, decode.acc_seg: 86.9501, aux.loss_ce: 0.1543, aux.acc_seg: 85.3644, loss: 0.4935, grad_norm: 4.3770
2023-02-19 05:42:55,553 - mmseg - INFO - Iter [20800/160000]	lr: 5.220e-05, eta: 11:03:09, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3432, decode.acc_seg: 86.8284, aux.loss_ce: 0.1546, aux.acc_seg: 85.1145, loss: 0.4978, grad_norm: 4.6336
2023-02-19 05:43:10,088 - mmseg - INFO - Iter [20850/160000]	lr: 5.218e-05, eta: 11:02:56, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3251, decode.acc_seg: 87.5362, aux.loss_ce: 0.1516, aux.acc_seg: 85.6900, loss: 0.4767, grad_norm: 4.4123
2023-02-19 05:43:24,064 - mmseg - INFO - Iter [20900/160000]	lr: 5.216e-05, eta: 11:02:40, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3621, decode.acc_seg: 86.3906, aux.loss_ce: 0.1630, aux.acc_seg: 84.7248, loss: 0.5251, grad_norm: 4.9514
2023-02-19 05:43:38,172 - mmseg - INFO - Iter [20950/160000]	lr: 5.214e-05, eta: 11:02:24, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3426, decode.acc_seg: 86.9567, aux.loss_ce: 0.1565, aux.acc_seg: 85.1384, loss: 0.4991, grad_norm: 4.0968
2023-02-19 05:43:51,793 - mmseg - INFO - Saving checkpoint at 21000 iterations
2023-02-19 05:43:55,026 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:43:55,026 - mmseg - INFO - Iter [21000/160000]	lr: 5.213e-05, eta: 11:02:27, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3370, decode.acc_seg: 86.8787, aux.loss_ce: 0.1536, aux.acc_seg: 85.3567, loss: 0.4906, grad_norm: 3.7794
2023-02-19 05:44:09,079 - mmseg - INFO - Iter [21050/160000]	lr: 5.211e-05, eta: 11:02:11, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3429, decode.acc_seg: 86.5244, aux.loss_ce: 0.1569, aux.acc_seg: 84.7705, loss: 0.4998, grad_norm: 4.6056
2023-02-19 05:44:22,885 - mmseg - INFO - Iter [21100/160000]	lr: 5.209e-05, eta: 11:01:53, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3271, decode.acc_seg: 87.3956, aux.loss_ce: 0.1502, aux.acc_seg: 85.5477, loss: 0.4773, grad_norm: 3.4366
2023-02-19 05:44:36,647 - mmseg - INFO - Iter [21150/160000]	lr: 5.207e-05, eta: 11:01:36, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3338, decode.acc_seg: 87.1859, aux.loss_ce: 0.1518, aux.acc_seg: 85.6915, loss: 0.4856, grad_norm: 3.4045
2023-02-19 05:44:50,355 - mmseg - INFO - Iter [21200/160000]	lr: 5.205e-05, eta: 11:01:17, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3289, decode.acc_seg: 87.2143, aux.loss_ce: 0.1496, aux.acc_seg: 85.6499, loss: 0.4785, grad_norm: 4.1089
2023-02-19 05:45:04,129 - mmseg - INFO - Iter [21250/160000]	lr: 5.203e-05, eta: 11:01:00, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3216, decode.acc_seg: 87.5572, aux.loss_ce: 0.1497, aux.acc_seg: 85.6089, loss: 0.4713, grad_norm: 3.3243
2023-02-19 05:45:18,355 - mmseg - INFO - Iter [21300/160000]	lr: 5.201e-05, eta: 11:00:45, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3272, decode.acc_seg: 87.5488, aux.loss_ce: 0.1513, aux.acc_seg: 85.6605, loss: 0.4785, grad_norm: 3.9140
2023-02-19 05:45:32,104 - mmseg - INFO - Iter [21350/160000]	lr: 5.199e-05, eta: 11:00:27, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3649, decode.acc_seg: 86.3356, aux.loss_ce: 0.1661, aux.acc_seg: 84.2808, loss: 0.5310, grad_norm: 4.3844
2023-02-19 05:45:46,523 - mmseg - INFO - Iter [21400/160000]	lr: 5.198e-05, eta: 11:00:14, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3585, decode.acc_seg: 86.2964, aux.loss_ce: 0.1657, aux.acc_seg: 84.6081, loss: 0.5242, grad_norm: 4.2923
2023-02-19 05:46:00,243 - mmseg - INFO - Iter [21450/160000]	lr: 5.196e-05, eta: 10:59:56, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3376, decode.acc_seg: 86.6707, aux.loss_ce: 0.1591, aux.acc_seg: 84.5475, loss: 0.4967, grad_norm: 3.9056
2023-02-19 05:46:16,186 - mmseg - INFO - Iter [21500/160000]	lr: 5.194e-05, eta: 10:59:52, time: 0.319, data_time: 0.048, memory: 15214, decode.loss_ce: 0.3442, decode.acc_seg: 86.7042, aux.loss_ce: 0.1585, aux.acc_seg: 84.6673, loss: 0.5027, grad_norm: 4.4151
2023-02-19 05:46:30,175 - mmseg - INFO - Iter [21550/160000]	lr: 5.192e-05, eta: 10:59:36, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3409, decode.acc_seg: 86.7879, aux.loss_ce: 0.1550, aux.acc_seg: 85.1294, loss: 0.4959, grad_norm: 4.1370
2023-02-19 05:46:44,088 - mmseg - INFO - Iter [21600/160000]	lr: 5.190e-05, eta: 10:59:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3226, decode.acc_seg: 87.8730, aux.loss_ce: 0.1520, aux.acc_seg: 85.8973, loss: 0.4746, grad_norm: 3.9512
2023-02-19 05:46:58,204 - mmseg - INFO - Iter [21650/160000]	lr: 5.188e-05, eta: 10:59:04, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3336, decode.acc_seg: 87.5803, aux.loss_ce: 0.1573, aux.acc_seg: 85.2310, loss: 0.4909, grad_norm: 3.8669
2023-02-19 05:47:12,178 - mmseg - INFO - Iter [21700/160000]	lr: 5.186e-05, eta: 10:58:47, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3341, decode.acc_seg: 87.1727, aux.loss_ce: 0.1508, aux.acc_seg: 85.3895, loss: 0.4849, grad_norm: 3.7802
2023-02-19 05:47:26,079 - mmseg - INFO - Iter [21750/160000]	lr: 5.184e-05, eta: 10:58:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3312, decode.acc_seg: 87.2565, aux.loss_ce: 0.1514, aux.acc_seg: 85.6660, loss: 0.4826, grad_norm: 4.2872
2023-02-19 05:47:39,866 - mmseg - INFO - Iter [21800/160000]	lr: 5.183e-05, eta: 10:58:13, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3276, decode.acc_seg: 87.3986, aux.loss_ce: 0.1494, aux.acc_seg: 85.8601, loss: 0.4769, grad_norm: 4.0500
2023-02-19 05:47:53,516 - mmseg - INFO - Iter [21850/160000]	lr: 5.181e-05, eta: 10:57:55, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3447, decode.acc_seg: 86.9702, aux.loss_ce: 0.1588, aux.acc_seg: 85.0795, loss: 0.5035, grad_norm: 4.3974
2023-02-19 05:48:07,405 - mmseg - INFO - Iter [21900/160000]	lr: 5.179e-05, eta: 10:57:38, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3292, decode.acc_seg: 87.0318, aux.loss_ce: 0.1509, aux.acc_seg: 85.5124, loss: 0.4801, grad_norm: 3.7424
2023-02-19 05:48:21,460 - mmseg - INFO - Iter [21950/160000]	lr: 5.177e-05, eta: 10:57:22, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3339, decode.acc_seg: 87.0034, aux.loss_ce: 0.1548, aux.acc_seg: 85.2642, loss: 0.4887, grad_norm: 3.9564
2023-02-19 05:48:35,865 - mmseg - INFO - Saving checkpoint at 22000 iterations
2023-02-19 05:48:39,098 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:48:39,098 - mmseg - INFO - Iter [22000/160000]	lr: 5.175e-05, eta: 10:57:29, time: 0.353, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3513, decode.acc_seg: 86.3372, aux.loss_ce: 0.1598, aux.acc_seg: 84.7016, loss: 0.5111, grad_norm: 4.9649
2023-02-19 05:48:53,083 - mmseg - INFO - Iter [22050/160000]	lr: 5.173e-05, eta: 10:57:13, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3398, decode.acc_seg: 87.0932, aux.loss_ce: 0.1534, aux.acc_seg: 85.3267, loss: 0.4932, grad_norm: 3.7355
2023-02-19 05:49:07,258 - mmseg - INFO - Iter [22100/160000]	lr: 5.171e-05, eta: 10:56:58, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3209, decode.acc_seg: 87.5568, aux.loss_ce: 0.1487, aux.acc_seg: 85.4532, loss: 0.4696, grad_norm: 3.4240
2023-02-19 05:49:21,004 - mmseg - INFO - Iter [22150/160000]	lr: 5.169e-05, eta: 10:56:40, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3533, decode.acc_seg: 86.4578, aux.loss_ce: 0.1597, aux.acc_seg: 84.6003, loss: 0.5129, grad_norm: 4.5052
2023-02-19 05:49:34,931 - mmseg - INFO - Iter [22200/160000]	lr: 5.168e-05, eta: 10:56:24, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3132, decode.acc_seg: 87.9764, aux.loss_ce: 0.1448, aux.acc_seg: 86.0326, loss: 0.4580, grad_norm: 3.8481
2023-02-19 05:49:49,034 - mmseg - INFO - Iter [22250/160000]	lr: 5.166e-05, eta: 10:56:08, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3052, decode.acc_seg: 87.8133, aux.loss_ce: 0.1427, aux.acc_seg: 85.9358, loss: 0.4479, grad_norm: 3.4426
2023-02-19 05:50:02,905 - mmseg - INFO - Iter [22300/160000]	lr: 5.164e-05, eta: 10:55:51, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3486, decode.acc_seg: 86.4473, aux.loss_ce: 0.1611, aux.acc_seg: 84.3115, loss: 0.5097, grad_norm: 4.5238
2023-02-19 05:50:16,641 - mmseg - INFO - Iter [22350/160000]	lr: 5.162e-05, eta: 10:55:34, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3157, decode.acc_seg: 87.6192, aux.loss_ce: 0.1493, aux.acc_seg: 85.4397, loss: 0.4650, grad_norm: 4.0093
2023-02-19 05:50:30,407 - mmseg - INFO - Iter [22400/160000]	lr: 5.160e-05, eta: 10:55:16, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3311, decode.acc_seg: 87.2200, aux.loss_ce: 0.1521, aux.acc_seg: 85.4204, loss: 0.4832, grad_norm: 3.6653
2023-02-19 05:50:44,008 - mmseg - INFO - Iter [22450/160000]	lr: 5.158e-05, eta: 10:54:58, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3209, decode.acc_seg: 87.8838, aux.loss_ce: 0.1485, aux.acc_seg: 86.0815, loss: 0.4694, grad_norm: 3.8547
2023-02-19 05:50:57,851 - mmseg - INFO - Iter [22500/160000]	lr: 5.156e-05, eta: 10:54:40, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3247, decode.acc_seg: 87.5908, aux.loss_ce: 0.1529, aux.acc_seg: 85.4578, loss: 0.4776, grad_norm: 3.7429
2023-02-19 05:51:11,755 - mmseg - INFO - Iter [22550/160000]	lr: 5.154e-05, eta: 10:54:24, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3451, decode.acc_seg: 86.5587, aux.loss_ce: 0.1606, aux.acc_seg: 84.7239, loss: 0.5057, grad_norm: 3.4094
2023-02-19 05:51:25,531 - mmseg - INFO - Iter [22600/160000]	lr: 5.153e-05, eta: 10:54:07, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3261, decode.acc_seg: 87.6047, aux.loss_ce: 0.1481, aux.acc_seg: 85.9284, loss: 0.4742, grad_norm: 3.9543
2023-02-19 05:51:40,775 - mmseg - INFO - Iter [22650/160000]	lr: 5.151e-05, eta: 10:53:58, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3454, decode.acc_seg: 86.4461, aux.loss_ce: 0.1572, aux.acc_seg: 84.6269, loss: 0.5026, grad_norm: 4.5764
2023-02-19 05:51:54,636 - mmseg - INFO - Iter [22700/160000]	lr: 5.149e-05, eta: 10:53:41, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3067, decode.acc_seg: 88.0803, aux.loss_ce: 0.1437, aux.acc_seg: 85.9634, loss: 0.4504, grad_norm: 4.2827
2023-02-19 05:52:11,235 - mmseg - INFO - Iter [22750/160000]	lr: 5.147e-05, eta: 10:53:41, time: 0.333, data_time: 0.050, memory: 15214, decode.loss_ce: 0.3156, decode.acc_seg: 87.8166, aux.loss_ce: 0.1462, aux.acc_seg: 85.7219, loss: 0.4618, grad_norm: 3.7322
2023-02-19 05:52:26,176 - mmseg - INFO - Iter [22800/160000]	lr: 5.145e-05, eta: 10:53:30, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3212, decode.acc_seg: 87.7748, aux.loss_ce: 0.1467, aux.acc_seg: 86.1720, loss: 0.4678, grad_norm: 3.5086
2023-02-19 05:52:39,922 - mmseg - INFO - Iter [22850/160000]	lr: 5.143e-05, eta: 10:53:13, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3175, decode.acc_seg: 87.7910, aux.loss_ce: 0.1481, aux.acc_seg: 85.9954, loss: 0.4656, grad_norm: 4.3809
2023-02-19 05:52:53,744 - mmseg - INFO - Iter [22900/160000]	lr: 5.141e-05, eta: 10:52:56, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2991, decode.acc_seg: 88.1354, aux.loss_ce: 0.1437, aux.acc_seg: 86.0526, loss: 0.4428, grad_norm: 4.2498
2023-02-19 05:53:08,023 - mmseg - INFO - Iter [22950/160000]	lr: 5.139e-05, eta: 10:52:42, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3350, decode.acc_seg: 86.9716, aux.loss_ce: 0.1563, aux.acc_seg: 84.7971, loss: 0.4914, grad_norm: 4.4256
2023-02-19 05:53:21,644 - mmseg - INFO - Saving checkpoint at 23000 iterations
2023-02-19 05:53:24,879 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:53:24,879 - mmseg - INFO - Iter [23000/160000]	lr: 5.138e-05, eta: 10:52:43, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3175, decode.acc_seg: 87.9090, aux.loss_ce: 0.1495, aux.acc_seg: 85.7227, loss: 0.4670, grad_norm: 3.5767
2023-02-19 05:53:39,472 - mmseg - INFO - Iter [23050/160000]	lr: 5.136e-05, eta: 10:52:30, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2999, decode.acc_seg: 88.2585, aux.loss_ce: 0.1403, aux.acc_seg: 86.3028, loss: 0.4401, grad_norm: 4.1882
2023-02-19 05:53:53,262 - mmseg - INFO - Iter [23100/160000]	lr: 5.134e-05, eta: 10:52:13, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3174, decode.acc_seg: 87.7368, aux.loss_ce: 0.1503, aux.acc_seg: 85.5006, loss: 0.4677, grad_norm: 3.9823
2023-02-19 05:54:07,442 - mmseg - INFO - Iter [23150/160000]	lr: 5.132e-05, eta: 10:51:58, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3111, decode.acc_seg: 87.8463, aux.loss_ce: 0.1449, aux.acc_seg: 85.9522, loss: 0.4560, grad_norm: 3.6156
2023-02-19 05:54:21,833 - mmseg - INFO - Iter [23200/160000]	lr: 5.130e-05, eta: 10:51:44, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3016, decode.acc_seg: 87.9871, aux.loss_ce: 0.1412, aux.acc_seg: 86.1249, loss: 0.4428, grad_norm: 3.4309
2023-02-19 05:54:35,939 - mmseg - INFO - Iter [23250/160000]	lr: 5.128e-05, eta: 10:51:29, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3173, decode.acc_seg: 87.6354, aux.loss_ce: 0.1471, aux.acc_seg: 85.7136, loss: 0.4645, grad_norm: 3.9934
2023-02-19 05:54:50,024 - mmseg - INFO - Iter [23300/160000]	lr: 5.126e-05, eta: 10:51:13, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3004, decode.acc_seg: 88.1366, aux.loss_ce: 0.1408, aux.acc_seg: 86.4488, loss: 0.4412, grad_norm: 3.5490
2023-02-19 05:55:03,578 - mmseg - INFO - Iter [23350/160000]	lr: 5.124e-05, eta: 10:50:55, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3203, decode.acc_seg: 87.6712, aux.loss_ce: 0.1509, aux.acc_seg: 85.6373, loss: 0.4711, grad_norm: 4.2366
2023-02-19 05:55:17,558 - mmseg - INFO - Iter [23400/160000]	lr: 5.123e-05, eta: 10:50:39, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3169, decode.acc_seg: 87.5357, aux.loss_ce: 0.1505, aux.acc_seg: 85.6038, loss: 0.4675, grad_norm: 3.5896
2023-02-19 05:55:32,086 - mmseg - INFO - Iter [23450/160000]	lr: 5.121e-05, eta: 10:50:26, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3152, decode.acc_seg: 87.6815, aux.loss_ce: 0.1482, aux.acc_seg: 85.6895, loss: 0.4633, grad_norm: 3.5280
2023-02-19 05:55:45,815 - mmseg - INFO - Iter [23500/160000]	lr: 5.119e-05, eta: 10:50:08, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2952, decode.acc_seg: 87.7861, aux.loss_ce: 0.1371, aux.acc_seg: 86.1284, loss: 0.4323, grad_norm: 3.5162
2023-02-19 05:55:59,718 - mmseg - INFO - Iter [23550/160000]	lr: 5.117e-05, eta: 10:49:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3231, decode.acc_seg: 87.5639, aux.loss_ce: 0.1516, aux.acc_seg: 85.5151, loss: 0.4747, grad_norm: 4.3141
2023-02-19 05:56:13,524 - mmseg - INFO - Iter [23600/160000]	lr: 5.115e-05, eta: 10:49:35, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3244, decode.acc_seg: 87.5067, aux.loss_ce: 0.1481, aux.acc_seg: 85.7085, loss: 0.4725, grad_norm: 3.5734
2023-02-19 05:56:27,386 - mmseg - INFO - Iter [23650/160000]	lr: 5.113e-05, eta: 10:49:18, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3258, decode.acc_seg: 87.2837, aux.loss_ce: 0.1530, aux.acc_seg: 85.0396, loss: 0.4788, grad_norm: 3.5541
2023-02-19 05:56:41,289 - mmseg - INFO - Iter [23700/160000]	lr: 5.111e-05, eta: 10:49:01, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3280, decode.acc_seg: 87.3280, aux.loss_ce: 0.1508, aux.acc_seg: 85.5055, loss: 0.4788, grad_norm: 4.1032
2023-02-19 05:56:54,933 - mmseg - INFO - Iter [23750/160000]	lr: 5.109e-05, eta: 10:48:43, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3377, decode.acc_seg: 86.8293, aux.loss_ce: 0.1567, aux.acc_seg: 84.6997, loss: 0.4944, grad_norm: 4.3352
2023-02-19 05:57:09,048 - mmseg - INFO - Iter [23800/160000]	lr: 5.108e-05, eta: 10:48:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3061, decode.acc_seg: 88.0780, aux.loss_ce: 0.1424, aux.acc_seg: 86.1424, loss: 0.4485, grad_norm: 3.5481
2023-02-19 05:57:23,796 - mmseg - INFO - Iter [23850/160000]	lr: 5.106e-05, eta: 10:48:16, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.3301, decode.acc_seg: 87.2614, aux.loss_ce: 0.1540, aux.acc_seg: 85.2150, loss: 0.4841, grad_norm: 4.7353
2023-02-19 05:57:37,881 - mmseg - INFO - Iter [23900/160000]	lr: 5.104e-05, eta: 10:48:01, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3132, decode.acc_seg: 87.9212, aux.loss_ce: 0.1459, aux.acc_seg: 85.8441, loss: 0.4591, grad_norm: 3.6166
2023-02-19 05:57:51,882 - mmseg - INFO - Iter [23950/160000]	lr: 5.102e-05, eta: 10:47:45, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3007, decode.acc_seg: 88.3327, aux.loss_ce: 0.1420, aux.acc_seg: 86.0590, loss: 0.4427, grad_norm: 3.9301
2023-02-19 05:58:08,275 - mmseg - INFO - Saving checkpoint at 24000 iterations
2023-02-19 05:58:11,571 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 05:58:11,572 - mmseg - INFO - Iter [24000/160000]	lr: 5.100e-05, eta: 10:48:02, time: 0.394, data_time: 0.048, memory: 15214, decode.loss_ce: 0.3207, decode.acc_seg: 88.1425, aux.loss_ce: 0.1485, aux.acc_seg: 85.9146, loss: 0.4693, grad_norm: 3.6575
2023-02-19 05:58:25,290 - mmseg - INFO - Iter [24050/160000]	lr: 5.098e-05, eta: 10:47:44, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2859, decode.acc_seg: 88.4377, aux.loss_ce: 0.1328, aux.acc_seg: 86.8447, loss: 0.4187, grad_norm: 3.3398
2023-02-19 05:58:39,475 - mmseg - INFO - Iter [24100/160000]	lr: 5.096e-05, eta: 10:47:29, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3023, decode.acc_seg: 88.3060, aux.loss_ce: 0.1428, aux.acc_seg: 86.3558, loss: 0.4452, grad_norm: 3.5515
2023-02-19 05:58:53,117 - mmseg - INFO - Iter [24150/160000]	lr: 5.094e-05, eta: 10:47:11, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3302, decode.acc_seg: 87.5054, aux.loss_ce: 0.1550, aux.acc_seg: 85.1499, loss: 0.4852, grad_norm: 4.8471
2023-02-19 05:59:07,226 - mmseg - INFO - Iter [24200/160000]	lr: 5.093e-05, eta: 10:46:56, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3098, decode.acc_seg: 87.8498, aux.loss_ce: 0.1447, aux.acc_seg: 85.9371, loss: 0.4545, grad_norm: 3.7987
2023-02-19 05:59:21,603 - mmseg - INFO - Iter [24250/160000]	lr: 5.091e-05, eta: 10:46:42, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2923, decode.acc_seg: 88.1791, aux.loss_ce: 0.1375, aux.acc_seg: 86.6144, loss: 0.4298, grad_norm: 3.7657
2023-02-19 05:59:36,275 - mmseg - INFO - Iter [24300/160000]	lr: 5.089e-05, eta: 10:46:30, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2952, decode.acc_seg: 88.4871, aux.loss_ce: 0.1405, aux.acc_seg: 86.4297, loss: 0.4357, grad_norm: 3.6975
2023-02-19 05:59:50,300 - mmseg - INFO - Iter [24350/160000]	lr: 5.087e-05, eta: 10:46:14, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2936, decode.acc_seg: 88.3877, aux.loss_ce: 0.1400, aux.acc_seg: 86.3594, loss: 0.4336, grad_norm: 4.8240
2023-02-19 06:00:04,108 - mmseg - INFO - Iter [24400/160000]	lr: 5.085e-05, eta: 10:45:57, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3044, decode.acc_seg: 88.0245, aux.loss_ce: 0.1436, aux.acc_seg: 86.1641, loss: 0.4480, grad_norm: 4.0236
2023-02-19 06:00:17,932 - mmseg - INFO - Iter [24450/160000]	lr: 5.083e-05, eta: 10:45:40, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3116, decode.acc_seg: 87.9263, aux.loss_ce: 0.1482, aux.acc_seg: 85.8498, loss: 0.4599, grad_norm: 4.3940
2023-02-19 06:00:31,730 - mmseg - INFO - Iter [24500/160000]	lr: 5.081e-05, eta: 10:45:23, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3110, decode.acc_seg: 87.9331, aux.loss_ce: 0.1445, aux.acc_seg: 85.8577, loss: 0.4556, grad_norm: 4.4103
2023-02-19 06:00:45,248 - mmseg - INFO - Iter [24550/160000]	lr: 5.079e-05, eta: 10:45:05, time: 0.270, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3319, decode.acc_seg: 87.2575, aux.loss_ce: 0.1586, aux.acc_seg: 84.9233, loss: 0.4905, grad_norm: 4.0313
2023-02-19 06:00:59,060 - mmseg - INFO - Iter [24600/160000]	lr: 5.078e-05, eta: 10:44:48, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2889, decode.acc_seg: 88.4357, aux.loss_ce: 0.1388, aux.acc_seg: 86.4221, loss: 0.4277, grad_norm: 3.5002
2023-02-19 06:01:12,589 - mmseg - INFO - Iter [24650/160000]	lr: 5.076e-05, eta: 10:44:29, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2930, decode.acc_seg: 88.6086, aux.loss_ce: 0.1388, aux.acc_seg: 86.6144, loss: 0.4318, grad_norm: 3.7660
2023-02-19 06:01:27,387 - mmseg - INFO - Iter [24700/160000]	lr: 5.074e-05, eta: 10:44:18, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3025, decode.acc_seg: 88.3812, aux.loss_ce: 0.1442, aux.acc_seg: 86.1097, loss: 0.4467, grad_norm: 4.2992
2023-02-19 06:01:42,162 - mmseg - INFO - Iter [24750/160000]	lr: 5.072e-05, eta: 10:44:06, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.3256, decode.acc_seg: 87.4094, aux.loss_ce: 0.1484, aux.acc_seg: 85.6695, loss: 0.4740, grad_norm: 3.9352
2023-02-19 06:01:56,102 - mmseg - INFO - Iter [24800/160000]	lr: 5.070e-05, eta: 10:43:50, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3087, decode.acc_seg: 87.8698, aux.loss_ce: 0.1476, aux.acc_seg: 85.7039, loss: 0.4564, grad_norm: 3.9807
2023-02-19 06:02:10,591 - mmseg - INFO - Iter [24850/160000]	lr: 5.068e-05, eta: 10:43:37, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3174, decode.acc_seg: 87.6548, aux.loss_ce: 0.1541, aux.acc_seg: 85.2968, loss: 0.4715, grad_norm: 3.9139
2023-02-19 06:02:24,388 - mmseg - INFO - Iter [24900/160000]	lr: 5.066e-05, eta: 10:43:20, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3220, decode.acc_seg: 87.5416, aux.loss_ce: 0.1511, aux.acc_seg: 85.4541, loss: 0.4731, grad_norm: 4.4203
2023-02-19 06:02:38,137 - mmseg - INFO - Iter [24950/160000]	lr: 5.064e-05, eta: 10:43:02, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3265, decode.acc_seg: 87.3298, aux.loss_ce: 0.1520, aux.acc_seg: 85.5537, loss: 0.4785, grad_norm: 4.5813
2023-02-19 06:02:52,181 - mmseg - INFO - Saving checkpoint at 25000 iterations
2023-02-19 06:02:55,433 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:02:55,434 - mmseg - INFO - Iter [25000/160000]	lr: 5.063e-05, eta: 10:43:05, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3157, decode.acc_seg: 88.0816, aux.loss_ce: 0.1488, aux.acc_seg: 86.1129, loss: 0.4646, grad_norm: 4.4203
2023-02-19 06:03:09,308 - mmseg - INFO - Iter [25050/160000]	lr: 5.061e-05, eta: 10:42:48, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3093, decode.acc_seg: 88.1660, aux.loss_ce: 0.1472, aux.acc_seg: 86.1302, loss: 0.4565, grad_norm: 3.7306
2023-02-19 06:03:23,314 - mmseg - INFO - Iter [25100/160000]	lr: 5.059e-05, eta: 10:42:32, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3030, decode.acc_seg: 87.9831, aux.loss_ce: 0.1416, aux.acc_seg: 86.0139, loss: 0.4446, grad_norm: 3.7025
2023-02-19 06:03:37,581 - mmseg - INFO - Iter [25150/160000]	lr: 5.057e-05, eta: 10:42:18, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3019, decode.acc_seg: 87.8046, aux.loss_ce: 0.1431, aux.acc_seg: 85.8270, loss: 0.4450, grad_norm: 3.7704
2023-02-19 06:03:51,732 - mmseg - INFO - Iter [25200/160000]	lr: 5.055e-05, eta: 10:42:03, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3113, decode.acc_seg: 87.9702, aux.loss_ce: 0.1470, aux.acc_seg: 85.8082, loss: 0.4583, grad_norm: 3.7293
2023-02-19 06:04:05,617 - mmseg - INFO - Iter [25250/160000]	lr: 5.053e-05, eta: 10:41:46, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3097, decode.acc_seg: 87.7651, aux.loss_ce: 0.1475, aux.acc_seg: 85.7440, loss: 0.4572, grad_norm: 3.9901
2023-02-19 06:04:21,524 - mmseg - INFO - Iter [25300/160000]	lr: 5.051e-05, eta: 10:41:41, time: 0.318, data_time: 0.047, memory: 15214, decode.loss_ce: 0.3006, decode.acc_seg: 87.9443, aux.loss_ce: 0.1403, aux.acc_seg: 86.1472, loss: 0.4409, grad_norm: 3.2965
2023-02-19 06:04:35,321 - mmseg - INFO - Iter [25350/160000]	lr: 5.049e-05, eta: 10:41:24, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2684, decode.acc_seg: 89.3572, aux.loss_ce: 0.1278, aux.acc_seg: 87.4370, loss: 0.3963, grad_norm: 3.5099
2023-02-19 06:04:49,500 - mmseg - INFO - Iter [25400/160000]	lr: 5.048e-05, eta: 10:41:09, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2970, decode.acc_seg: 88.4248, aux.loss_ce: 0.1398, aux.acc_seg: 86.4532, loss: 0.4368, grad_norm: 4.5523
2023-02-19 06:05:03,855 - mmseg - INFO - Iter [25450/160000]	lr: 5.046e-05, eta: 10:40:55, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2993, decode.acc_seg: 88.1886, aux.loss_ce: 0.1409, aux.acc_seg: 86.3328, loss: 0.4402, grad_norm: 4.5599
2023-02-19 06:05:17,550 - mmseg - INFO - Iter [25500/160000]	lr: 5.044e-05, eta: 10:40:37, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3050, decode.acc_seg: 87.9360, aux.loss_ce: 0.1448, aux.acc_seg: 85.8164, loss: 0.4497, grad_norm: 3.5141
2023-02-19 06:05:31,364 - mmseg - INFO - Iter [25550/160000]	lr: 5.042e-05, eta: 10:40:21, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2881, decode.acc_seg: 88.7862, aux.loss_ce: 0.1339, aux.acc_seg: 86.8517, loss: 0.4220, grad_norm: 3.8232
2023-02-19 06:05:45,684 - mmseg - INFO - Iter [25600/160000]	lr: 5.040e-05, eta: 10:40:07, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2889, decode.acc_seg: 88.7917, aux.loss_ce: 0.1380, aux.acc_seg: 86.5806, loss: 0.4269, grad_norm: 4.5001
2023-02-19 06:05:59,440 - mmseg - INFO - Iter [25650/160000]	lr: 5.038e-05, eta: 10:39:49, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3030, decode.acc_seg: 88.3743, aux.loss_ce: 0.1402, aux.acc_seg: 86.5248, loss: 0.4432, grad_norm: 4.7498
2023-02-19 06:06:13,195 - mmseg - INFO - Iter [25700/160000]	lr: 5.036e-05, eta: 10:39:32, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2893, decode.acc_seg: 88.7168, aux.loss_ce: 0.1373, aux.acc_seg: 86.6750, loss: 0.4266, grad_norm: 4.1691
2023-02-19 06:06:26,941 - mmseg - INFO - Iter [25750/160000]	lr: 5.034e-05, eta: 10:39:15, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2856, decode.acc_seg: 89.0848, aux.loss_ce: 0.1351, aux.acc_seg: 87.0418, loss: 0.4207, grad_norm: 3.5475
2023-02-19 06:06:41,458 - mmseg - INFO - Iter [25800/160000]	lr: 5.033e-05, eta: 10:39:02, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2926, decode.acc_seg: 88.4842, aux.loss_ce: 0.1413, aux.acc_seg: 85.9254, loss: 0.4339, grad_norm: 4.4121
2023-02-19 06:06:55,614 - mmseg - INFO - Iter [25850/160000]	lr: 5.031e-05, eta: 10:38:47, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3203, decode.acc_seg: 87.8210, aux.loss_ce: 0.1496, aux.acc_seg: 86.0560, loss: 0.4699, grad_norm: 3.8490
2023-02-19 06:07:09,798 - mmseg - INFO - Iter [25900/160000]	lr: 5.029e-05, eta: 10:38:32, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2956, decode.acc_seg: 88.4383, aux.loss_ce: 0.1390, aux.acc_seg: 86.5584, loss: 0.4347, grad_norm: 3.5837
2023-02-19 06:07:23,697 - mmseg - INFO - Iter [25950/160000]	lr: 5.027e-05, eta: 10:38:16, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3158, decode.acc_seg: 87.7288, aux.loss_ce: 0.1507, aux.acc_seg: 85.5861, loss: 0.4664, grad_norm: 4.0793
2023-02-19 06:07:38,087 - mmseg - INFO - Saving checkpoint at 26000 iterations
2023-02-19 06:07:41,300 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:07:41,300 - mmseg - INFO - Iter [26000/160000]	lr: 5.025e-05, eta: 10:38:19, time: 0.352, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2778, decode.acc_seg: 88.7926, aux.loss_ce: 0.1351, aux.acc_seg: 86.5907, loss: 0.4129, grad_norm: 4.3799
2023-02-19 06:07:55,101 - mmseg - INFO - Iter [26050/160000]	lr: 5.023e-05, eta: 10:38:02, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3012, decode.acc_seg: 88.3608, aux.loss_ce: 0.1403, aux.acc_seg: 86.3345, loss: 0.4415, grad_norm: 3.8072
2023-02-19 06:08:09,523 - mmseg - INFO - Iter [26100/160000]	lr: 5.021e-05, eta: 10:37:49, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2826, decode.acc_seg: 89.1283, aux.loss_ce: 0.1370, aux.acc_seg: 86.6127, loss: 0.4196, grad_norm: 3.2030
2023-02-19 06:08:23,616 - mmseg - INFO - Iter [26150/160000]	lr: 5.019e-05, eta: 10:37:33, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3013, decode.acc_seg: 88.1395, aux.loss_ce: 0.1470, aux.acc_seg: 85.8520, loss: 0.4483, grad_norm: 3.9796
2023-02-19 06:08:37,440 - mmseg - INFO - Iter [26200/160000]	lr: 5.018e-05, eta: 10:37:16, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2912, decode.acc_seg: 88.3182, aux.loss_ce: 0.1393, aux.acc_seg: 86.1376, loss: 0.4305, grad_norm: 4.1489
2023-02-19 06:08:51,085 - mmseg - INFO - Iter [26250/160000]	lr: 5.016e-05, eta: 10:36:59, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3013, decode.acc_seg: 88.4798, aux.loss_ce: 0.1464, aux.acc_seg: 86.2601, loss: 0.4477, grad_norm: 3.8930
2023-02-19 06:09:04,933 - mmseg - INFO - Iter [26300/160000]	lr: 5.014e-05, eta: 10:36:43, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3105, decode.acc_seg: 87.8756, aux.loss_ce: 0.1459, aux.acc_seg: 85.8433, loss: 0.4564, grad_norm: 4.1407
2023-02-19 06:09:19,945 - mmseg - INFO - Iter [26350/160000]	lr: 5.012e-05, eta: 10:36:32, time: 0.300, data_time: 0.006, memory: 15214, decode.loss_ce: 0.3149, decode.acc_seg: 87.7150, aux.loss_ce: 0.1478, aux.acc_seg: 85.7108, loss: 0.4627, grad_norm: 3.5691
2023-02-19 06:09:33,552 - mmseg - INFO - Iter [26400/160000]	lr: 5.010e-05, eta: 10:36:14, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2904, decode.acc_seg: 88.2552, aux.loss_ce: 0.1397, aux.acc_seg: 86.2228, loss: 0.4302, grad_norm: 4.0880
2023-02-19 06:09:47,145 - mmseg - INFO - Iter [26450/160000]	lr: 5.008e-05, eta: 10:35:56, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2969, decode.acc_seg: 88.3555, aux.loss_ce: 0.1419, aux.acc_seg: 85.9747, loss: 0.4388, grad_norm: 4.5104
2023-02-19 06:10:00,828 - mmseg - INFO - Iter [26500/160000]	lr: 5.006e-05, eta: 10:35:39, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3166, decode.acc_seg: 87.9339, aux.loss_ce: 0.1491, aux.acc_seg: 85.8518, loss: 0.4656, grad_norm: 4.1346
2023-02-19 06:10:16,782 - mmseg - INFO - Iter [26550/160000]	lr: 5.004e-05, eta: 10:35:33, time: 0.319, data_time: 0.047, memory: 15214, decode.loss_ce: 0.3092, decode.acc_seg: 87.4456, aux.loss_ce: 0.1469, aux.acc_seg: 85.2640, loss: 0.4561, grad_norm: 3.5896
2023-02-19 06:10:30,744 - mmseg - INFO - Iter [26600/160000]	lr: 5.003e-05, eta: 10:35:17, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2979, decode.acc_seg: 88.6130, aux.loss_ce: 0.1394, aux.acc_seg: 86.5570, loss: 0.4372, grad_norm: 3.7974
2023-02-19 06:10:45,166 - mmseg - INFO - Iter [26650/160000]	lr: 5.001e-05, eta: 10:35:04, time: 0.289, data_time: 0.006, memory: 15214, decode.loss_ce: 0.2930, decode.acc_seg: 88.6138, aux.loss_ce: 0.1405, aux.acc_seg: 86.3217, loss: 0.4335, grad_norm: 3.5989
2023-02-19 06:10:58,859 - mmseg - INFO - Iter [26700/160000]	lr: 4.999e-05, eta: 10:34:46, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2772, decode.acc_seg: 88.9458, aux.loss_ce: 0.1338, aux.acc_seg: 86.7479, loss: 0.4110, grad_norm: 3.3371
2023-02-19 06:11:12,551 - mmseg - INFO - Iter [26750/160000]	lr: 4.997e-05, eta: 10:34:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2833, decode.acc_seg: 88.5585, aux.loss_ce: 0.1361, aux.acc_seg: 86.5006, loss: 0.4194, grad_norm: 3.5894
2023-02-19 06:11:26,109 - mmseg - INFO - Iter [26800/160000]	lr: 4.995e-05, eta: 10:34:11, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2932, decode.acc_seg: 88.5729, aux.loss_ce: 0.1393, aux.acc_seg: 86.5441, loss: 0.4325, grad_norm: 4.8982
2023-02-19 06:11:40,806 - mmseg - INFO - Iter [26850/160000]	lr: 4.993e-05, eta: 10:33:59, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2972, decode.acc_seg: 88.2493, aux.loss_ce: 0.1419, aux.acc_seg: 86.0793, loss: 0.4391, grad_norm: 4.0071
2023-02-19 06:11:55,029 - mmseg - INFO - Iter [26900/160000]	lr: 4.991e-05, eta: 10:33:44, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2831, decode.acc_seg: 89.0671, aux.loss_ce: 0.1342, aux.acc_seg: 87.1102, loss: 0.4173, grad_norm: 3.6686
2023-02-19 06:12:09,828 - mmseg - INFO - Iter [26950/160000]	lr: 4.989e-05, eta: 10:33:33, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2893, decode.acc_seg: 88.8218, aux.loss_ce: 0.1388, aux.acc_seg: 86.5889, loss: 0.4281, grad_norm: 3.5666
2023-02-19 06:12:23,378 - mmseg - INFO - Saving checkpoint at 27000 iterations
2023-02-19 06:12:26,686 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:12:26,687 - mmseg - INFO - Iter [27000/160000]	lr: 4.988e-05, eta: 10:33:31, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2831, decode.acc_seg: 89.0525, aux.loss_ce: 0.1359, aux.acc_seg: 86.8835, loss: 0.4190, grad_norm: 3.5462
2023-02-19 06:12:40,435 - mmseg - INFO - Iter [27050/160000]	lr: 4.986e-05, eta: 10:33:14, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2957, decode.acc_seg: 88.4145, aux.loss_ce: 0.1393, aux.acc_seg: 86.4289, loss: 0.4350, grad_norm: 3.7980
2023-02-19 06:12:54,333 - mmseg - INFO - Iter [27100/160000]	lr: 4.984e-05, eta: 10:32:58, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2796, decode.acc_seg: 89.0991, aux.loss_ce: 0.1350, aux.acc_seg: 86.9038, loss: 0.4146, grad_norm: 3.5056
2023-02-19 06:13:08,638 - mmseg - INFO - Iter [27150/160000]	lr: 4.982e-05, eta: 10:32:44, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2960, decode.acc_seg: 88.4518, aux.loss_ce: 0.1404, aux.acc_seg: 86.5400, loss: 0.4364, grad_norm: 4.0953
2023-02-19 06:13:22,229 - mmseg - INFO - Iter [27200/160000]	lr: 4.980e-05, eta: 10:32:26, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2832, decode.acc_seg: 88.8538, aux.loss_ce: 0.1351, aux.acc_seg: 86.4959, loss: 0.4183, grad_norm: 4.3815
2023-02-19 06:13:35,942 - mmseg - INFO - Iter [27250/160000]	lr: 4.978e-05, eta: 10:32:09, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2833, decode.acc_seg: 88.9009, aux.loss_ce: 0.1331, aux.acc_seg: 86.8712, loss: 0.4164, grad_norm: 3.8848
2023-02-19 06:13:49,733 - mmseg - INFO - Iter [27300/160000]	lr: 4.976e-05, eta: 10:31:52, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2840, decode.acc_seg: 88.8929, aux.loss_ce: 0.1351, aux.acc_seg: 86.7854, loss: 0.4191, grad_norm: 3.2376
2023-02-19 06:14:03,528 - mmseg - INFO - Iter [27350/160000]	lr: 4.974e-05, eta: 10:31:36, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2848, decode.acc_seg: 88.3607, aux.loss_ce: 0.1365, aux.acc_seg: 86.1853, loss: 0.4212, grad_norm: 3.9507
2023-02-19 06:14:17,622 - mmseg - INFO - Iter [27400/160000]	lr: 4.973e-05, eta: 10:31:20, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2783, decode.acc_seg: 89.0755, aux.loss_ce: 0.1369, aux.acc_seg: 86.6197, loss: 0.4152, grad_norm: 3.3168
2023-02-19 06:14:31,441 - mmseg - INFO - Iter [27450/160000]	lr: 4.971e-05, eta: 10:31:04, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2880, decode.acc_seg: 88.5015, aux.loss_ce: 0.1390, aux.acc_seg: 86.4365, loss: 0.4271, grad_norm: 3.8787
2023-02-19 06:14:45,422 - mmseg - INFO - Iter [27500/160000]	lr: 4.969e-05, eta: 10:30:48, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3030, decode.acc_seg: 88.0929, aux.loss_ce: 0.1436, aux.acc_seg: 85.9068, loss: 0.4466, grad_norm: 4.1498
2023-02-19 06:14:59,097 - mmseg - INFO - Iter [27550/160000]	lr: 4.967e-05, eta: 10:30:31, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.3047, decode.acc_seg: 88.0037, aux.loss_ce: 0.1452, aux.acc_seg: 85.7311, loss: 0.4499, grad_norm: 3.8671
2023-02-19 06:15:12,752 - mmseg - INFO - Iter [27600/160000]	lr: 4.965e-05, eta: 10:30:14, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2865, decode.acc_seg: 88.7417, aux.loss_ce: 0.1344, aux.acc_seg: 86.8749, loss: 0.4209, grad_norm: 3.7684
2023-02-19 06:15:27,139 - mmseg - INFO - Iter [27650/160000]	lr: 4.963e-05, eta: 10:30:00, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3094, decode.acc_seg: 88.0424, aux.loss_ce: 0.1474, aux.acc_seg: 85.9324, loss: 0.4568, grad_norm: 4.1652
2023-02-19 06:15:40,806 - mmseg - INFO - Iter [27700/160000]	lr: 4.961e-05, eta: 10:29:43, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2943, decode.acc_seg: 88.8122, aux.loss_ce: 0.1409, aux.acc_seg: 86.3932, loss: 0.4352, grad_norm: 3.6238
2023-02-19 06:15:54,583 - mmseg - INFO - Iter [27750/160000]	lr: 4.959e-05, eta: 10:29:26, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2807, decode.acc_seg: 89.0620, aux.loss_ce: 0.1357, aux.acc_seg: 86.7892, loss: 0.4164, grad_norm: 4.5936
2023-02-19 06:16:11,207 - mmseg - INFO - Iter [27800/160000]	lr: 4.958e-05, eta: 10:29:23, time: 0.332, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2961, decode.acc_seg: 88.8660, aux.loss_ce: 0.1416, aux.acc_seg: 86.6305, loss: 0.4377, grad_norm: 3.7080
2023-02-19 06:16:24,820 - mmseg - INFO - Iter [27850/160000]	lr: 4.956e-05, eta: 10:29:05, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2750, decode.acc_seg: 89.3920, aux.loss_ce: 0.1324, aux.acc_seg: 87.1301, loss: 0.4074, grad_norm: 3.7765
2023-02-19 06:16:38,481 - mmseg - INFO - Iter [27900/160000]	lr: 4.954e-05, eta: 10:28:48, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2750, decode.acc_seg: 89.1913, aux.loss_ce: 0.1351, aux.acc_seg: 86.6112, loss: 0.4101, grad_norm: 4.1223
2023-02-19 06:16:52,191 - mmseg - INFO - Iter [27950/160000]	lr: 4.952e-05, eta: 10:28:31, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2851, decode.acc_seg: 88.9472, aux.loss_ce: 0.1391, aux.acc_seg: 86.6088, loss: 0.4242, grad_norm: 4.5962
2023-02-19 06:17:06,366 - mmseg - INFO - Saving checkpoint at 28000 iterations
2023-02-19 06:17:09,611 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:17:09,611 - mmseg - INFO - Iter [28000/160000]	lr: 4.950e-05, eta: 10:28:32, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2913, decode.acc_seg: 88.7753, aux.loss_ce: 0.1390, aux.acc_seg: 86.3915, loss: 0.4303, grad_norm: 3.5319
2023-02-19 06:17:24,205 - mmseg - INFO - Iter [28050/160000]	lr: 4.948e-05, eta: 10:28:19, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2783, decode.acc_seg: 89.3147, aux.loss_ce: 0.1334, aux.acc_seg: 87.2208, loss: 0.4116, grad_norm: 4.0345
2023-02-19 06:17:37,776 - mmseg - INFO - Iter [28100/160000]	lr: 4.946e-05, eta: 10:28:01, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2687, decode.acc_seg: 89.2642, aux.loss_ce: 0.1321, aux.acc_seg: 86.9332, loss: 0.4008, grad_norm: 3.7575
2023-02-19 06:17:51,648 - mmseg - INFO - Iter [28150/160000]	lr: 4.944e-05, eta: 10:27:45, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2647, decode.acc_seg: 89.7470, aux.loss_ce: 0.1303, aux.acc_seg: 87.6079, loss: 0.3950, grad_norm: 3.8443
2023-02-19 06:18:05,821 - mmseg - INFO - Iter [28200/160000]	lr: 4.943e-05, eta: 10:27:30, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.3006, decode.acc_seg: 88.3744, aux.loss_ce: 0.1404, aux.acc_seg: 86.3915, loss: 0.4410, grad_norm: 3.9087
2023-02-19 06:18:19,501 - mmseg - INFO - Iter [28250/160000]	lr: 4.941e-05, eta: 10:27:13, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2699, decode.acc_seg: 89.2174, aux.loss_ce: 0.1325, aux.acc_seg: 86.9073, loss: 0.4023, grad_norm: 3.3531
2023-02-19 06:18:33,569 - mmseg - INFO - Iter [28300/160000]	lr: 4.939e-05, eta: 10:26:58, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2664, decode.acc_seg: 89.7541, aux.loss_ce: 0.1286, aux.acc_seg: 87.5317, loss: 0.3950, grad_norm: 3.4393
2023-02-19 06:18:49,289 - mmseg - INFO - Iter [28350/160000]	lr: 4.937e-05, eta: 10:26:50, time: 0.314, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2937, decode.acc_seg: 88.8290, aux.loss_ce: 0.1440, aux.acc_seg: 86.4028, loss: 0.4377, grad_norm: 3.7870
2023-02-19 06:19:02,995 - mmseg - INFO - Iter [28400/160000]	lr: 4.935e-05, eta: 10:26:33, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2853, decode.acc_seg: 88.8855, aux.loss_ce: 0.1369, aux.acc_seg: 86.7683, loss: 0.4222, grad_norm: 3.6350
2023-02-19 06:19:16,770 - mmseg - INFO - Iter [28450/160000]	lr: 4.933e-05, eta: 10:26:17, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2962, decode.acc_seg: 88.2736, aux.loss_ce: 0.1403, aux.acc_seg: 86.2875, loss: 0.4365, grad_norm: 4.8781
2023-02-19 06:19:30,549 - mmseg - INFO - Iter [28500/160000]	lr: 4.931e-05, eta: 10:26:00, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2703, decode.acc_seg: 89.6141, aux.loss_ce: 0.1287, aux.acc_seg: 87.7289, loss: 0.3990, grad_norm: 3.1243
2023-02-19 06:19:44,411 - mmseg - INFO - Iter [28550/160000]	lr: 4.929e-05, eta: 10:25:44, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2863, decode.acc_seg: 88.8281, aux.loss_ce: 0.1359, aux.acc_seg: 86.6041, loss: 0.4223, grad_norm: 4.3486
2023-02-19 06:19:58,410 - mmseg - INFO - Iter [28600/160000]	lr: 4.928e-05, eta: 10:25:28, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2778, decode.acc_seg: 89.1071, aux.loss_ce: 0.1337, aux.acc_seg: 86.9553, loss: 0.4115, grad_norm: 3.9406
2023-02-19 06:20:12,113 - mmseg - INFO - Iter [28650/160000]	lr: 4.926e-05, eta: 10:25:11, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2900, decode.acc_seg: 88.8419, aux.loss_ce: 0.1447, aux.acc_seg: 86.2447, loss: 0.4347, grad_norm: 3.9294
2023-02-19 06:20:26,180 - mmseg - INFO - Iter [28700/160000]	lr: 4.924e-05, eta: 10:24:56, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2908, decode.acc_seg: 88.4720, aux.loss_ce: 0.1373, aux.acc_seg: 86.3171, loss: 0.4281, grad_norm: 3.8647
2023-02-19 06:20:40,354 - mmseg - INFO - Iter [28750/160000]	lr: 4.922e-05, eta: 10:24:41, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2827, decode.acc_seg: 89.0858, aux.loss_ce: 0.1372, aux.acc_seg: 86.9155, loss: 0.4199, grad_norm: 5.3422
2023-02-19 06:20:54,742 - mmseg - INFO - Iter [28800/160000]	lr: 4.920e-05, eta: 10:24:27, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2737, decode.acc_seg: 89.3437, aux.loss_ce: 0.1356, aux.acc_seg: 86.9223, loss: 0.4092, grad_norm: 4.0876
2023-02-19 06:21:08,537 - mmseg - INFO - Iter [28850/160000]	lr: 4.918e-05, eta: 10:24:11, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2786, decode.acc_seg: 89.0174, aux.loss_ce: 0.1345, aux.acc_seg: 86.8320, loss: 0.4131, grad_norm: 4.7310
2023-02-19 06:21:22,520 - mmseg - INFO - Iter [28900/160000]	lr: 4.916e-05, eta: 10:23:55, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2878, decode.acc_seg: 88.5598, aux.loss_ce: 0.1375, aux.acc_seg: 86.5587, loss: 0.4252, grad_norm: 4.9703
2023-02-19 06:21:36,581 - mmseg - INFO - Iter [28950/160000]	lr: 4.914e-05, eta: 10:23:40, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2808, decode.acc_seg: 88.7695, aux.loss_ce: 0.1309, aux.acc_seg: 87.0988, loss: 0.4116, grad_norm: 3.7733
2023-02-19 06:21:50,374 - mmseg - INFO - Saving checkpoint at 29000 iterations
2023-02-19 06:21:53,627 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:21:53,627 - mmseg - INFO - Iter [29000/160000]	lr: 4.913e-05, eta: 10:23:38, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2812, decode.acc_seg: 89.1460, aux.loss_ce: 0.1366, aux.acc_seg: 86.7821, loss: 0.4178, grad_norm: 4.6395
2023-02-19 06:22:09,935 - mmseg - INFO - Iter [29050/160000]	lr: 4.911e-05, eta: 10:23:33, time: 0.326, data_time: 0.048, memory: 15214, decode.loss_ce: 0.2806, decode.acc_seg: 88.9235, aux.loss_ce: 0.1403, aux.acc_seg: 86.5128, loss: 0.4208, grad_norm: 3.6037
2023-02-19 06:22:23,943 - mmseg - INFO - Iter [29100/160000]	lr: 4.909e-05, eta: 10:23:18, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2774, decode.acc_seg: 89.0264, aux.loss_ce: 0.1359, aux.acc_seg: 86.7172, loss: 0.4133, grad_norm: 4.4175
2023-02-19 06:22:37,860 - mmseg - INFO - Iter [29150/160000]	lr: 4.907e-05, eta: 10:23:02, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2756, decode.acc_seg: 89.2715, aux.loss_ce: 0.1302, aux.acc_seg: 87.4738, loss: 0.4058, grad_norm: 3.1063
2023-02-19 06:22:52,021 - mmseg - INFO - Iter [29200/160000]	lr: 4.905e-05, eta: 10:22:47, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2774, decode.acc_seg: 89.4357, aux.loss_ce: 0.1348, aux.acc_seg: 87.3260, loss: 0.4121, grad_norm: 4.4685
2023-02-19 06:23:06,293 - mmseg - INFO - Iter [29250/160000]	lr: 4.903e-05, eta: 10:22:33, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2764, decode.acc_seg: 89.2570, aux.loss_ce: 0.1324, aux.acc_seg: 87.0447, loss: 0.4088, grad_norm: 3.4987
2023-02-19 06:23:19,966 - mmseg - INFO - Iter [29300/160000]	lr: 4.901e-05, eta: 10:22:16, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2732, decode.acc_seg: 89.2095, aux.loss_ce: 0.1341, aux.acc_seg: 86.8732, loss: 0.4072, grad_norm: 3.8331
2023-02-19 06:23:33,972 - mmseg - INFO - Iter [29350/160000]	lr: 4.899e-05, eta: 10:22:00, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2727, decode.acc_seg: 89.3727, aux.loss_ce: 0.1327, aux.acc_seg: 87.0363, loss: 0.4054, grad_norm: 3.3466
2023-02-19 06:23:47,534 - mmseg - INFO - Iter [29400/160000]	lr: 4.898e-05, eta: 10:21:43, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2881, decode.acc_seg: 88.7113, aux.loss_ce: 0.1352, aux.acc_seg: 86.7984, loss: 0.4233, grad_norm: 3.3217
2023-02-19 06:24:01,339 - mmseg - INFO - Iter [29450/160000]	lr: 4.896e-05, eta: 10:21:26, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2575, decode.acc_seg: 89.9481, aux.loss_ce: 0.1243, aux.acc_seg: 87.9132, loss: 0.3818, grad_norm: 2.9345
2023-02-19 06:24:15,082 - mmseg - INFO - Iter [29500/160000]	lr: 4.894e-05, eta: 10:21:09, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2782, decode.acc_seg: 88.9822, aux.loss_ce: 0.1360, aux.acc_seg: 86.6363, loss: 0.4141, grad_norm: 3.8905
2023-02-19 06:24:29,810 - mmseg - INFO - Iter [29550/160000]	lr: 4.892e-05, eta: 10:20:57, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2642, decode.acc_seg: 89.7691, aux.loss_ce: 0.1260, aux.acc_seg: 87.8541, loss: 0.3902, grad_norm: 3.4588
2023-02-19 06:24:43,672 - mmseg - INFO - Iter [29600/160000]	lr: 4.890e-05, eta: 10:20:41, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2783, decode.acc_seg: 89.0382, aux.loss_ce: 0.1343, aux.acc_seg: 86.7800, loss: 0.4126, grad_norm: 3.1115
2023-02-19 06:24:57,747 - mmseg - INFO - Iter [29650/160000]	lr: 4.888e-05, eta: 10:20:26, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2780, decode.acc_seg: 89.0317, aux.loss_ce: 0.1369, aux.acc_seg: 86.5978, loss: 0.4149, grad_norm: 3.2303
2023-02-19 06:25:11,951 - mmseg - INFO - Iter [29700/160000]	lr: 4.886e-05, eta: 10:20:11, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2760, decode.acc_seg: 89.0636, aux.loss_ce: 0.1385, aux.acc_seg: 86.4567, loss: 0.4145, grad_norm: 3.8754
2023-02-19 06:25:25,801 - mmseg - INFO - Iter [29750/160000]	lr: 4.884e-05, eta: 10:19:55, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2916, decode.acc_seg: 88.7213, aux.loss_ce: 0.1402, aux.acc_seg: 86.4969, loss: 0.4317, grad_norm: 4.0802
2023-02-19 06:25:39,347 - mmseg - INFO - Iter [29800/160000]	lr: 4.883e-05, eta: 10:19:38, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2635, decode.acc_seg: 89.6040, aux.loss_ce: 0.1319, aux.acc_seg: 87.2241, loss: 0.3954, grad_norm: 3.8582
2023-02-19 06:25:53,147 - mmseg - INFO - Iter [29850/160000]	lr: 4.881e-05, eta: 10:19:21, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2604, decode.acc_seg: 89.6593, aux.loss_ce: 0.1280, aux.acc_seg: 87.5317, loss: 0.3884, grad_norm: 3.1203
2023-02-19 06:26:06,908 - mmseg - INFO - Iter [29900/160000]	lr: 4.879e-05, eta: 10:19:05, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2754, decode.acc_seg: 89.4932, aux.loss_ce: 0.1336, aux.acc_seg: 87.2328, loss: 0.4090, grad_norm: 6.6002
2023-02-19 06:26:22,004 - mmseg - INFO - Iter [29950/160000]	lr: 4.877e-05, eta: 10:18:54, time: 0.301, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2825, decode.acc_seg: 88.9965, aux.loss_ce: 0.1310, aux.acc_seg: 87.2391, loss: 0.4135, grad_norm: 4.3327
2023-02-19 06:26:35,661 - mmseg - INFO - Saving checkpoint at 30000 iterations
2023-02-19 06:26:38,896 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:26:38,896 - mmseg - INFO - Iter [30000/160000]	lr: 4.875e-05, eta: 10:18:51, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2803, decode.acc_seg: 89.0265, aux.loss_ce: 0.1371, aux.acc_seg: 86.7632, loss: 0.4174, grad_norm: 7.1844
2023-02-19 06:26:52,902 - mmseg - INFO - Iter [30050/160000]	lr: 4.873e-05, eta: 10:18:35, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2836, decode.acc_seg: 88.8503, aux.loss_ce: 0.1375, aux.acc_seg: 86.4853, loss: 0.4211, grad_norm: 3.9527
2023-02-19 06:27:06,944 - mmseg - INFO - Iter [30100/160000]	lr: 4.871e-05, eta: 10:18:20, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2726, decode.acc_seg: 89.6596, aux.loss_ce: 0.1348, aux.acc_seg: 87.1358, loss: 0.4075, grad_norm: 3.4695
2023-02-19 06:27:20,625 - mmseg - INFO - Iter [30150/160000]	lr: 4.869e-05, eta: 10:18:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2753, decode.acc_seg: 89.3823, aux.loss_ce: 0.1352, aux.acc_seg: 86.7240, loss: 0.4105, grad_norm: 3.4195
2023-02-19 06:27:35,416 - mmseg - INFO - Iter [30200/160000]	lr: 4.868e-05, eta: 10:17:51, time: 0.296, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2696, decode.acc_seg: 89.6587, aux.loss_ce: 0.1328, aux.acc_seg: 87.2233, loss: 0.4024, grad_norm: 3.4776
2023-02-19 06:27:49,310 - mmseg - INFO - Iter [30250/160000]	lr: 4.866e-05, eta: 10:17:35, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2715, decode.acc_seg: 89.3439, aux.loss_ce: 0.1347, aux.acc_seg: 86.9911, loss: 0.4062, grad_norm: 3.0937
2023-02-19 06:28:02,942 - mmseg - INFO - Iter [30300/160000]	lr: 4.864e-05, eta: 10:17:18, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2659, decode.acc_seg: 89.4350, aux.loss_ce: 0.1304, aux.acc_seg: 87.2371, loss: 0.3963, grad_norm: 3.6488
2023-02-19 06:28:18,697 - mmseg - INFO - Iter [30350/160000]	lr: 4.862e-05, eta: 10:17:10, time: 0.315, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2709, decode.acc_seg: 89.2659, aux.loss_ce: 0.1326, aux.acc_seg: 87.0962, loss: 0.4035, grad_norm: 3.1354
2023-02-19 06:28:32,269 - mmseg - INFO - Iter [30400/160000]	lr: 4.860e-05, eta: 10:16:53, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2962, decode.acc_seg: 88.9302, aux.loss_ce: 0.1417, aux.acc_seg: 86.5772, loss: 0.4379, grad_norm: 3.9433
2023-02-19 06:28:45,904 - mmseg - INFO - Iter [30450/160000]	lr: 4.858e-05, eta: 10:16:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2793, decode.acc_seg: 88.9653, aux.loss_ce: 0.1370, aux.acc_seg: 86.6456, loss: 0.4163, grad_norm: 4.2866
2023-02-19 06:29:00,111 - mmseg - INFO - Iter [30500/160000]	lr: 4.856e-05, eta: 10:16:21, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2499, decode.acc_seg: 90.2052, aux.loss_ce: 0.1231, aux.acc_seg: 87.9570, loss: 0.3729, grad_norm: 2.9247
2023-02-19 06:29:13,898 - mmseg - INFO - Iter [30550/160000]	lr: 4.854e-05, eta: 10:16:05, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2815, decode.acc_seg: 89.0428, aux.loss_ce: 0.1361, aux.acc_seg: 86.8748, loss: 0.4176, grad_norm: 4.3881
2023-02-19 06:29:27,696 - mmseg - INFO - Iter [30600/160000]	lr: 4.853e-05, eta: 10:15:49, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2590, decode.acc_seg: 89.5697, aux.loss_ce: 0.1293, aux.acc_seg: 87.2545, loss: 0.3883, grad_norm: 2.9569
2023-02-19 06:29:41,350 - mmseg - INFO - Iter [30650/160000]	lr: 4.851e-05, eta: 10:15:32, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2544, decode.acc_seg: 89.8592, aux.loss_ce: 0.1273, aux.acc_seg: 87.2974, loss: 0.3818, grad_norm: 3.4083
2023-02-19 06:29:55,062 - mmseg - INFO - Iter [30700/160000]	lr: 4.849e-05, eta: 10:15:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2855, decode.acc_seg: 89.2968, aux.loss_ce: 0.1380, aux.acc_seg: 86.7925, loss: 0.4235, grad_norm: 3.8695
2023-02-19 06:30:09,250 - mmseg - INFO - Iter [30750/160000]	lr: 4.847e-05, eta: 10:15:01, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2612, decode.acc_seg: 89.7762, aux.loss_ce: 0.1312, aux.acc_seg: 87.3533, loss: 0.3925, grad_norm: 3.8055
2023-02-19 06:30:23,142 - mmseg - INFO - Iter [30800/160000]	lr: 4.845e-05, eta: 10:14:45, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2803, decode.acc_seg: 89.5420, aux.loss_ce: 0.1372, aux.acc_seg: 87.2813, loss: 0.4175, grad_norm: 3.7196
2023-02-19 06:30:37,391 - mmseg - INFO - Iter [30850/160000]	lr: 4.843e-05, eta: 10:14:30, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2786, decode.acc_seg: 89.3463, aux.loss_ce: 0.1372, aux.acc_seg: 86.9140, loss: 0.4158, grad_norm: 4.2975
2023-02-19 06:30:52,124 - mmseg - INFO - Iter [30900/160000]	lr: 4.841e-05, eta: 10:14:18, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2570, decode.acc_seg: 89.8590, aux.loss_ce: 0.1304, aux.acc_seg: 87.2668, loss: 0.3874, grad_norm: 3.5412
2023-02-19 06:31:05,835 - mmseg - INFO - Iter [30950/160000]	lr: 4.839e-05, eta: 10:14:01, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2597, decode.acc_seg: 89.7809, aux.loss_ce: 0.1295, aux.acc_seg: 87.5283, loss: 0.3891, grad_norm: 2.9811
2023-02-19 06:31:19,800 - mmseg - INFO - Saving checkpoint at 31000 iterations
2023-02-19 06:31:23,111 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:31:23,111 - mmseg - INFO - Iter [31000/160000]	lr: 4.838e-05, eta: 10:14:00, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2835, decode.acc_seg: 88.9256, aux.loss_ce: 0.1343, aux.acc_seg: 86.9254, loss: 0.4178, grad_norm: 3.7036
2023-02-19 06:31:36,744 - mmseg - INFO - Iter [31050/160000]	lr: 4.836e-05, eta: 10:13:43, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2648, decode.acc_seg: 89.7891, aux.loss_ce: 0.1328, aux.acc_seg: 87.2435, loss: 0.3976, grad_norm: 3.1210
2023-02-19 06:31:50,386 - mmseg - INFO - Iter [31100/160000]	lr: 4.834e-05, eta: 10:13:26, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2803, decode.acc_seg: 89.0091, aux.loss_ce: 0.1375, aux.acc_seg: 86.6408, loss: 0.4178, grad_norm: 4.0463
2023-02-19 06:32:04,661 - mmseg - INFO - Iter [31150/160000]	lr: 4.832e-05, eta: 10:13:11, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2612, decode.acc_seg: 89.6269, aux.loss_ce: 0.1301, aux.acc_seg: 87.2448, loss: 0.3913, grad_norm: 2.9592
2023-02-19 06:32:18,330 - mmseg - INFO - Iter [31200/160000]	lr: 4.830e-05, eta: 10:12:55, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2851, decode.acc_seg: 89.0688, aux.loss_ce: 0.1380, aux.acc_seg: 86.6253, loss: 0.4231, grad_norm: 3.6947
2023-02-19 06:32:32,133 - mmseg - INFO - Iter [31250/160000]	lr: 4.828e-05, eta: 10:12:38, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2601, decode.acc_seg: 89.6203, aux.loss_ce: 0.1266, aux.acc_seg: 87.4183, loss: 0.3867, grad_norm: 2.8838
2023-02-19 06:32:46,288 - mmseg - INFO - Iter [31300/160000]	lr: 4.826e-05, eta: 10:12:24, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2786, decode.acc_seg: 89.1959, aux.loss_ce: 0.1388, aux.acc_seg: 86.6726, loss: 0.4174, grad_norm: 3.6160
2023-02-19 06:32:59,908 - mmseg - INFO - Iter [31350/160000]	lr: 4.824e-05, eta: 10:12:07, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2634, decode.acc_seg: 89.5470, aux.loss_ce: 0.1263, aux.acc_seg: 87.2177, loss: 0.3896, grad_norm: 3.6675
2023-02-19 06:33:14,193 - mmseg - INFO - Iter [31400/160000]	lr: 4.823e-05, eta: 10:11:52, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2728, decode.acc_seg: 88.9648, aux.loss_ce: 0.1375, aux.acc_seg: 86.5528, loss: 0.4103, grad_norm: 3.5576
2023-02-19 06:33:28,310 - mmseg - INFO - Iter [31450/160000]	lr: 4.821e-05, eta: 10:11:37, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2677, decode.acc_seg: 88.8553, aux.loss_ce: 0.1306, aux.acc_seg: 86.7185, loss: 0.3983, grad_norm: 5.8714
2023-02-19 06:33:42,556 - mmseg - INFO - Iter [31500/160000]	lr: 4.819e-05, eta: 10:11:23, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2512, decode.acc_seg: 90.1222, aux.loss_ce: 0.1248, aux.acc_seg: 87.5950, loss: 0.3761, grad_norm: 3.0701
2023-02-19 06:33:56,367 - mmseg - INFO - Iter [31550/160000]	lr: 4.817e-05, eta: 10:11:07, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2687, decode.acc_seg: 89.2754, aux.loss_ce: 0.1342, aux.acc_seg: 86.8215, loss: 0.4029, grad_norm: 3.1105
2023-02-19 06:34:12,438 - mmseg - INFO - Iter [31600/160000]	lr: 4.815e-05, eta: 10:11:00, time: 0.321, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2633, decode.acc_seg: 89.6737, aux.loss_ce: 0.1292, aux.acc_seg: 87.2856, loss: 0.3925, grad_norm: 3.4742
2023-02-19 06:34:26,265 - mmseg - INFO - Iter [31650/160000]	lr: 4.813e-05, eta: 10:10:44, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2464, decode.acc_seg: 90.1661, aux.loss_ce: 0.1237, aux.acc_seg: 87.7703, loss: 0.3701, grad_norm: 3.1305
2023-02-19 06:34:40,704 - mmseg - INFO - Iter [31700/160000]	lr: 4.811e-05, eta: 10:10:30, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2528, decode.acc_seg: 90.3273, aux.loss_ce: 0.1297, aux.acc_seg: 87.5128, loss: 0.3825, grad_norm: 2.8652
2023-02-19 06:34:54,643 - mmseg - INFO - Iter [31750/160000]	lr: 4.809e-05, eta: 10:10:15, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2541, decode.acc_seg: 89.8995, aux.loss_ce: 0.1220, aux.acc_seg: 87.6543, loss: 0.3760, grad_norm: 3.7139
2023-02-19 06:35:08,173 - mmseg - INFO - Iter [31800/160000]	lr: 4.808e-05, eta: 10:09:57, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2650, decode.acc_seg: 89.9490, aux.loss_ce: 0.1306, aux.acc_seg: 87.5949, loss: 0.3955, grad_norm: 3.0714
2023-02-19 06:35:22,268 - mmseg - INFO - Iter [31850/160000]	lr: 4.806e-05, eta: 10:09:42, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2663, decode.acc_seg: 89.5553, aux.loss_ce: 0.1332, aux.acc_seg: 87.0084, loss: 0.3996, grad_norm: 4.1823
2023-02-19 06:35:36,094 - mmseg - INFO - Iter [31900/160000]	lr: 4.804e-05, eta: 10:09:26, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2457, decode.acc_seg: 90.2281, aux.loss_ce: 0.1224, aux.acc_seg: 88.0094, loss: 0.3681, grad_norm: 2.9282
2023-02-19 06:35:49,886 - mmseg - INFO - Iter [31950/160000]	lr: 4.802e-05, eta: 10:09:10, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2722, decode.acc_seg: 89.3153, aux.loss_ce: 0.1348, aux.acc_seg: 86.9298, loss: 0.4070, grad_norm: 3.5364
2023-02-19 06:36:03,777 - mmseg - INFO - Saving checkpoint at 32000 iterations
2023-02-19 06:36:06,996 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:36:06,996 - mmseg - INFO - Iter [32000/160000]	lr: 4.800e-05, eta: 10:09:07, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2679, decode.acc_seg: 89.6084, aux.loss_ce: 0.1312, aux.acc_seg: 87.2375, loss: 0.3991, grad_norm: 3.5791
2023-02-19 06:36:21,575 - mmseg - INFO - per class results:
2023-02-19 06:36:21,581 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 76.24 | 82.37 |
|       building      |  81.8 | 89.36 |
|         sky         | 94.19 | 97.05 |
|        floor        | 81.04 | 91.23 |
|         tree        |  72.6 | 93.43 |
|       ceiling       | 84.01 | 92.65 |
|         road        | 82.64 | 87.67 |
|         bed         | 86.79 | 97.31 |
|      windowpane     | 59.87 | 72.87 |
|        grass        | 68.08 | 79.37 |
|       cabinet       | 58.45 | 69.26 |
|       sidewalk      |  65.7 | 87.22 |
|        person       | 80.32 | 91.39 |
|        earth        | 39.92 | 54.85 |
|         door        | 50.67 | 76.31 |
|        table        | 61.46 | 71.86 |
|       mountain      | 55.95 |  71.5 |
|        plant        | 47.89 | 54.51 |
|       curtain       | 73.99 | 89.41 |
|        chair        |  59.8 | 72.44 |
|         car         | 84.46 | 90.35 |
|        water        | 60.01 | 74.61 |
|       painting      | 72.19 | 86.14 |
|         sofa        | 70.15 | 85.58 |
|        shelf        |  39.1 | 54.15 |
|        house        | 46.81 |  68.0 |
|         sea         | 67.21 | 84.26 |
|        mirror       |  63.1 | 86.97 |
|         rug         | 65.33 | 82.43 |
|        field        | 32.55 | 53.73 |
|       armchair      | 48.53 | 71.69 |
|         seat        | 65.45 | 83.23 |
|        fence        | 41.53 |  64.2 |
|         desk        | 48.57 | 71.69 |
|         rock        | 46.38 | 60.06 |
|       wardrobe      |  48.2 | 79.09 |
|         lamp        |  59.5 | 72.64 |
|       bathtub       | 78.92 | 85.51 |
|       railing       | 32.17 | 43.11 |
|       cushion       | 55.41 | 71.02 |
|         base        | 32.48 | 65.33 |
|         box         | 25.52 | 30.33 |
|        column       | 50.41 | 63.96 |
|      signboard      | 33.69 | 50.88 |
|   chest of drawers  | 37.89 | 75.99 |
|       counter       | 30.63 | 57.59 |
|         sand        | 52.78 | 68.58 |
|         sink        | 66.35 | 83.23 |
|      skyscraper     |  68.1 | 88.33 |
|      fireplace      | 63.43 | 95.31 |
|     refrigerator    | 72.42 | 81.86 |
|      grandstand     | 47.22 | 66.53 |
|         path        | 18.71 | 28.73 |
|        stairs       | 32.42 | 43.89 |
|        runway       |  69.4 | 88.06 |
|         case        | 49.81 | 73.75 |
|      pool table     | 91.17 | 96.57 |
|        pillow       | 38.54 | 40.33 |
|     screen door     | 70.85 | 81.25 |
|       stairway      | 43.48 | 51.82 |
|        river        | 13.23 | 32.46 |
|        bridge       | 50.36 | 61.64 |
|       bookcase      | 32.21 | 48.56 |
|        blind        |  51.4 |  72.8 |
|     coffee table    | 57.89 | 84.39 |
|        toilet       | 85.09 | 91.64 |
|        flower       | 42.29 | 77.59 |
|         book        |  42.6 | 79.13 |
|         hill        |  4.84 |  5.33 |
|        bench        | 49.38 |  55.4 |
|      countertop     | 52.59 |  82.5 |
|        stove        | 73.01 |  78.3 |
|         palm        |  57.4 | 79.87 |
|    kitchen island   | 39.38 | 89.39 |
|       computer      | 76.27 | 88.38 |
|     swivel chair    | 34.24 | 39.29 |
|         boat        | 47.27 | 51.33 |
|         bar         |  32.5 | 45.28 |
|    arcade machine   | 48.76 | 53.08 |
|        hovel        | 14.33 | 16.86 |
|         bus         | 89.67 | 95.53 |
|        towel        | 57.73 | 87.28 |
|        light        | 51.73 | 64.22 |
|        truck        | 38.27 | 51.23 |
|        tower        | 28.03 | 52.81 |
|      chandelier     | 67.87 | 77.65 |
|        awning       | 26.25 | 29.64 |
|     streetlight     | 25.93 | 32.13 |
|        booth        | 37.12 | 43.34 |
| television receiver | 62.19 | 83.01 |
|       airplane      | 52.57 | 66.81 |
|      dirt track     |  5.52 | 68.21 |
|       apparel       | 11.81 | 14.35 |
|         pole        | 26.43 |  42.2 |
|         land        |  0.03 |  0.04 |
|      bannister      | 15.22 | 26.14 |
|      escalator      | 35.07 | 58.22 |
|       ottoman       | 42.19 | 53.55 |
|        bottle       | 36.04 | 73.81 |
|        buffet       | 46.12 | 54.43 |
|        poster       | 24.77 | 59.15 |
|        stage        |  18.8 | 31.02 |
|         van         | 41.78 | 70.05 |
|         ship        | 56.62 | 97.23 |
|       fountain      | 21.63 | 21.73 |
|    conveyer belt    | 73.42 | 94.09 |
|        canopy       | 36.91 | 48.81 |
|        washer       |  73.1 | 76.52 |
|      plaything      | 25.83 | 33.66 |
|    swimming pool    |  46.8 | 74.92 |
|        stool        | 37.53 |  48.6 |
|        barrel       | 44.22 |  86.1 |
|        basket       | 34.43 | 47.73 |
|      waterfall      | 52.55 | 58.95 |
|         tent        | 94.16 | 97.36 |
|         bag         |  20.4 |  26.7 |
|       minibike      | 65.89 | 88.14 |
|        cradle       | 74.74 | 97.51 |
|         oven        | 41.53 | 66.03 |
|         ball        | 50.69 | 73.37 |
|         food        | 52.41 | 57.04 |
|         step        | 15.66 | 28.19 |
|         tank        | 63.08 |  64.1 |
|      trade name     | 25.19 | 31.02 |
|      microwave      | 76.99 | 91.44 |
|         pot         |  38.7 | 43.44 |
|        animal       | 62.52 | 66.24 |
|       bicycle       |  54.7 | 70.77 |
|         lake        |  31.0 | 32.14 |
|      dishwasher     | 53.86 | 85.68 |
|        screen       | 57.92 | 92.76 |
|       blanket       |  7.59 |  8.55 |
|      sculpture      | 73.71 | 80.98 |
|         hood        | 57.52 | 72.81 |
|        sconce       | 36.04 | 58.56 |
|         vase        | 28.41 |  64.8 |
|    traffic light    | 25.32 |  28.5 |
|         tray        |  3.58 |  4.24 |
|        ashcan       | 46.65 |  60.1 |
|         fan         | 62.77 | 81.84 |
|         pier        |  23.3 | 48.72 |
|      crt screen     |  3.7  |  9.98 |
|        plate        |  55.0 | 78.09 |
|       monitor       |  1.64 |  1.66 |
|    bulletin board   | 36.46 | 43.03 |
|        shower       |  0.97 |  2.12 |
|       radiator      | 61.64 | 76.05 |
|        glass        | 14.63 | 16.32 |
|        clock        | 41.07 | 49.24 |
|         flag        | 64.87 | 72.91 |
+---------------------+-------+-------+
2023-02-19 06:36:21,581 - mmseg - INFO - Summary:
2023-02-19 06:36:21,581 - mmseg - INFO - 
+------+-------+-------+
| aAcc |  mIoU |  mAcc |
+------+-------+-------+
| 82.5 | 48.51 | 63.66 |
+------+-------+-------+
2023-02-19 06:36:24,762 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_32000.pth.
2023-02-19 06:36:24,762 - mmseg - INFO - Best mIoU is 0.4851 at 32000 iter.
2023-02-19 06:36:24,762 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:36:24,762 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8250, mIoU: 0.4851, mAcc: 0.6366, IoU.wall: 0.7624, IoU.building: 0.8180, IoU.sky: 0.9419, IoU.floor: 0.8104, IoU.tree: 0.7260, IoU.ceiling: 0.8401, IoU.road: 0.8264, IoU.bed : 0.8679, IoU.windowpane: 0.5987, IoU.grass: 0.6808, IoU.cabinet: 0.5845, IoU.sidewalk: 0.6570, IoU.person: 0.8032, IoU.earth: 0.3992, IoU.door: 0.5067, IoU.table: 0.6146, IoU.mountain: 0.5595, IoU.plant: 0.4789, IoU.curtain: 0.7399, IoU.chair: 0.5980, IoU.car: 0.8446, IoU.water: 0.6001, IoU.painting: 0.7219, IoU.sofa: 0.7015, IoU.shelf: 0.3910, IoU.house: 0.4681, IoU.sea: 0.6721, IoU.mirror: 0.6310, IoU.rug: 0.6533, IoU.field: 0.3255, IoU.armchair: 0.4853, IoU.seat: 0.6545, IoU.fence: 0.4153, IoU.desk: 0.4857, IoU.rock: 0.4638, IoU.wardrobe: 0.4820, IoU.lamp: 0.5950, IoU.bathtub: 0.7892, IoU.railing: 0.3217, IoU.cushion: 0.5541, IoU.base: 0.3248, IoU.box: 0.2552, IoU.column: 0.5041, IoU.signboard: 0.3369, IoU.chest of drawers: 0.3789, IoU.counter: 0.3063, IoU.sand: 0.5278, IoU.sink: 0.6635, IoU.skyscraper: 0.6810, IoU.fireplace: 0.6343, IoU.refrigerator: 0.7242, IoU.grandstand: 0.4722, IoU.path: 0.1871, IoU.stairs: 0.3242, IoU.runway: 0.6940, IoU.case: 0.4981, IoU.pool table: 0.9117, IoU.pillow: 0.3854, IoU.screen door: 0.7085, IoU.stairway: 0.4348, IoU.river: 0.1323, IoU.bridge: 0.5036, IoU.bookcase: 0.3221, IoU.blind: 0.5140, IoU.coffee table: 0.5789, IoU.toilet: 0.8509, IoU.flower: 0.4229, IoU.book: 0.4260, IoU.hill: 0.0484, IoU.bench: 0.4938, IoU.countertop: 0.5259, IoU.stove: 0.7301, IoU.palm: 0.5740, IoU.kitchen island: 0.3938, IoU.computer: 0.7627, IoU.swivel chair: 0.3424, IoU.boat: 0.4727, IoU.bar: 0.3250, IoU.arcade machine: 0.4876, IoU.hovel: 0.1433, IoU.bus: 0.8967, IoU.towel: 0.5773, IoU.light: 0.5173, IoU.truck: 0.3827, IoU.tower: 0.2803, IoU.chandelier: 0.6787, IoU.awning: 0.2625, IoU.streetlight: 0.2593, IoU.booth: 0.3712, IoU.television receiver: 0.6219, IoU.airplane: 0.5257, IoU.dirt track: 0.0552, IoU.apparel: 0.1181, IoU.pole: 0.2643, IoU.land: 0.0003, IoU.bannister: 0.1522, IoU.escalator: 0.3507, IoU.ottoman: 0.4219, IoU.bottle: 0.3604, IoU.buffet: 0.4612, IoU.poster: 0.2477, IoU.stage: 0.1880, IoU.van: 0.4178, IoU.ship: 0.5662, IoU.fountain: 0.2163, IoU.conveyer belt: 0.7342, IoU.canopy: 0.3691, IoU.washer: 0.7310, IoU.plaything: 0.2583, IoU.swimming pool: 0.4680, IoU.stool: 0.3753, IoU.barrel: 0.4422, IoU.basket: 0.3443, IoU.waterfall: 0.5255, IoU.tent: 0.9416, IoU.bag: 0.2040, IoU.minibike: 0.6589, IoU.cradle: 0.7474, IoU.oven: 0.4153, IoU.ball: 0.5069, IoU.food: 0.5241, IoU.step: 0.1566, IoU.tank: 0.6308, IoU.trade name: 0.2519, IoU.microwave: 0.7699, IoU.pot: 0.3870, IoU.animal: 0.6252, IoU.bicycle: 0.5470, IoU.lake: 0.3100, IoU.dishwasher: 0.5386, IoU.screen: 0.5792, IoU.blanket: 0.0759, IoU.sculpture: 0.7371, IoU.hood: 0.5752, IoU.sconce: 0.3604, IoU.vase: 0.2841, IoU.traffic light: 0.2532, IoU.tray: 0.0358, IoU.ashcan: 0.4665, IoU.fan: 0.6277, IoU.pier: 0.2330, IoU.crt screen: 0.0370, IoU.plate: 0.5500, IoU.monitor: 0.0164, IoU.bulletin board: 0.3646, IoU.shower: 0.0097, IoU.radiator: 0.6164, IoU.glass: 0.1463, IoU.clock: 0.4107, IoU.flag: 0.6487, Acc.wall: 0.8237, Acc.building: 0.8936, Acc.sky: 0.9705, Acc.floor: 0.9123, Acc.tree: 0.9343, Acc.ceiling: 0.9265, Acc.road: 0.8767, Acc.bed : 0.9731, Acc.windowpane: 0.7287, Acc.grass: 0.7937, Acc.cabinet: 0.6926, Acc.sidewalk: 0.8722, Acc.person: 0.9139, Acc.earth: 0.5485, Acc.door: 0.7631, Acc.table: 0.7186, Acc.mountain: 0.7150, Acc.plant: 0.5451, Acc.curtain: 0.8941, Acc.chair: 0.7244, Acc.car: 0.9035, Acc.water: 0.7461, Acc.painting: 0.8614, Acc.sofa: 0.8558, Acc.shelf: 0.5415, Acc.house: 0.6800, Acc.sea: 0.8426, Acc.mirror: 0.8697, Acc.rug: 0.8243, Acc.field: 0.5373, Acc.armchair: 0.7169, Acc.seat: 0.8323, Acc.fence: 0.6420, Acc.desk: 0.7169, Acc.rock: 0.6006, Acc.wardrobe: 0.7909, Acc.lamp: 0.7264, Acc.bathtub: 0.8551, Acc.railing: 0.4311, Acc.cushion: 0.7102, Acc.base: 0.6533, Acc.box: 0.3033, Acc.column: 0.6396, Acc.signboard: 0.5088, Acc.chest of drawers: 0.7599, Acc.counter: 0.5759, Acc.sand: 0.6858, Acc.sink: 0.8323, Acc.skyscraper: 0.8833, Acc.fireplace: 0.9531, Acc.refrigerator: 0.8186, Acc.grandstand: 0.6653, Acc.path: 0.2873, Acc.stairs: 0.4389, Acc.runway: 0.8806, Acc.case: 0.7375, Acc.pool table: 0.9657, Acc.pillow: 0.4033, Acc.screen door: 0.8125, Acc.stairway: 0.5182, Acc.river: 0.3246, Acc.bridge: 0.6164, Acc.bookcase: 0.4856, Acc.blind: 0.7280, Acc.coffee table: 0.8439, Acc.toilet: 0.9164, Acc.flower: 0.7759, Acc.book: 0.7913, Acc.hill: 0.0533, Acc.bench: 0.5540, Acc.countertop: 0.8250, Acc.stove: 0.7830, Acc.palm: 0.7987, Acc.kitchen island: 0.8939, Acc.computer: 0.8838, Acc.swivel chair: 0.3929, Acc.boat: 0.5133, Acc.bar: 0.4528, Acc.arcade machine: 0.5308, Acc.hovel: 0.1686, Acc.bus: 0.9553, Acc.towel: 0.8728, Acc.light: 0.6422, Acc.truck: 0.5123, Acc.tower: 0.5281, Acc.chandelier: 0.7765, Acc.awning: 0.2964, Acc.streetlight: 0.3213, Acc.booth: 0.4334, Acc.television receiver: 0.8301, Acc.airplane: 0.6681, Acc.dirt track: 0.6821, Acc.apparel: 0.1435, Acc.pole: 0.4220, Acc.land: 0.0004, Acc.bannister: 0.2614, Acc.escalator: 0.5822, Acc.ottoman: 0.5355, Acc.bottle: 0.7381, Acc.buffet: 0.5443, Acc.poster: 0.5915, Acc.stage: 0.3102, Acc.van: 0.7005, Acc.ship: 0.9723, Acc.fountain: 0.2173, Acc.conveyer belt: 0.9409, Acc.canopy: 0.4881, Acc.washer: 0.7652, Acc.plaything: 0.3366, Acc.swimming pool: 0.7492, Acc.stool: 0.4860, Acc.barrel: 0.8610, Acc.basket: 0.4773, Acc.waterfall: 0.5895, Acc.tent: 0.9736, Acc.bag: 0.2670, Acc.minibike: 0.8814, Acc.cradle: 0.9751, Acc.oven: 0.6603, Acc.ball: 0.7337, Acc.food: 0.5704, Acc.step: 0.2819, Acc.tank: 0.6410, Acc.trade name: 0.3102, Acc.microwave: 0.9144, Acc.pot: 0.4344, Acc.animal: 0.6624, Acc.bicycle: 0.7077, Acc.lake: 0.3214, Acc.dishwasher: 0.8568, Acc.screen: 0.9276, Acc.blanket: 0.0855, Acc.sculpture: 0.8098, Acc.hood: 0.7281, Acc.sconce: 0.5856, Acc.vase: 0.6480, Acc.traffic light: 0.2850, Acc.tray: 0.0424, Acc.ashcan: 0.6010, Acc.fan: 0.8184, Acc.pier: 0.4872, Acc.crt screen: 0.0998, Acc.plate: 0.7809, Acc.monitor: 0.0166, Acc.bulletin board: 0.4303, Acc.shower: 0.0212, Acc.radiator: 0.7605, Acc.glass: 0.1632, Acc.clock: 0.4924, Acc.flag: 0.7291
2023-02-19 06:36:38,405 - mmseg - INFO - Iter [32050/160000]	lr: 4.798e-05, eta: 10:10:01, time: 0.628, data_time: 0.360, memory: 15214, decode.loss_ce: 0.2622, decode.acc_seg: 89.7500, aux.loss_ce: 0.1297, aux.acc_seg: 87.4743, loss: 0.3919, grad_norm: 3.6083
2023-02-19 06:36:52,269 - mmseg - INFO - Iter [32100/160000]	lr: 4.796e-05, eta: 10:09:45, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2640, decode.acc_seg: 89.8944, aux.loss_ce: 0.1344, aux.acc_seg: 87.1176, loss: 0.3984, grad_norm: 4.8555
2023-02-19 06:37:06,515 - mmseg - INFO - Iter [32150/160000]	lr: 4.794e-05, eta: 10:09:31, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2650, decode.acc_seg: 89.7287, aux.loss_ce: 0.1309, aux.acc_seg: 87.4264, loss: 0.3959, grad_norm: 3.4310
2023-02-19 06:37:20,598 - mmseg - INFO - Iter [32200/160000]	lr: 4.793e-05, eta: 10:09:15, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2537, decode.acc_seg: 89.8627, aux.loss_ce: 0.1292, aux.acc_seg: 87.2843, loss: 0.3829, grad_norm: 2.9921
2023-02-19 06:37:34,379 - mmseg - INFO - Iter [32250/160000]	lr: 4.791e-05, eta: 10:08:59, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2597, decode.acc_seg: 89.6461, aux.loss_ce: 0.1239, aux.acc_seg: 87.7082, loss: 0.3836, grad_norm: 3.8426
2023-02-19 06:37:48,859 - mmseg - INFO - Iter [32300/160000]	lr: 4.789e-05, eta: 10:08:46, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2551, decode.acc_seg: 90.0360, aux.loss_ce: 0.1286, aux.acc_seg: 87.4115, loss: 0.3836, grad_norm: 3.2272
2023-02-19 06:38:02,796 - mmseg - INFO - Iter [32350/160000]	lr: 4.787e-05, eta: 10:08:30, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2683, decode.acc_seg: 89.6823, aux.loss_ce: 0.1316, aux.acc_seg: 87.2936, loss: 0.3999, grad_norm: 3.5064
2023-02-19 06:38:16,446 - mmseg - INFO - Iter [32400/160000]	lr: 4.785e-05, eta: 10:08:13, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2762, decode.acc_seg: 89.2613, aux.loss_ce: 0.1334, aux.acc_seg: 87.1208, loss: 0.4096, grad_norm: 4.5745
2023-02-19 06:38:30,436 - mmseg - INFO - Iter [32450/160000]	lr: 4.783e-05, eta: 10:07:57, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2511, decode.acc_seg: 89.9829, aux.loss_ce: 0.1250, aux.acc_seg: 87.5090, loss: 0.3761, grad_norm: 3.3664
2023-02-19 06:38:44,423 - mmseg - INFO - Iter [32500/160000]	lr: 4.781e-05, eta: 10:07:42, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2475, decode.acc_seg: 90.0314, aux.loss_ce: 0.1232, aux.acc_seg: 87.7434, loss: 0.3707, grad_norm: 3.7883
2023-02-19 06:38:58,236 - mmseg - INFO - Iter [32550/160000]	lr: 4.779e-05, eta: 10:07:26, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2699, decode.acc_seg: 89.8351, aux.loss_ce: 0.1334, aux.acc_seg: 87.2608, loss: 0.4034, grad_norm: 3.4303
2023-02-19 06:39:11,937 - mmseg - INFO - Iter [32600/160000]	lr: 4.778e-05, eta: 10:07:09, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2487, decode.acc_seg: 90.3694, aux.loss_ce: 0.1227, aux.acc_seg: 88.2105, loss: 0.3713, grad_norm: 2.7495
2023-02-19 06:39:26,152 - mmseg - INFO - Iter [32650/160000]	lr: 4.776e-05, eta: 10:06:54, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2626, decode.acc_seg: 89.9541, aux.loss_ce: 0.1327, aux.acc_seg: 87.4295, loss: 0.3953, grad_norm: 3.7488
2023-02-19 06:39:40,533 - mmseg - INFO - Iter [32700/160000]	lr: 4.774e-05, eta: 10:06:40, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2653, decode.acc_seg: 89.5077, aux.loss_ce: 0.1323, aux.acc_seg: 87.0952, loss: 0.3977, grad_norm: 3.9836
2023-02-19 06:39:54,391 - mmseg - INFO - Iter [32750/160000]	lr: 4.772e-05, eta: 10:06:24, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2878, decode.acc_seg: 88.5070, aux.loss_ce: 0.1411, aux.acc_seg: 85.9391, loss: 0.4289, grad_norm: 4.0547
2023-02-19 06:40:08,316 - mmseg - INFO - Iter [32800/160000]	lr: 4.770e-05, eta: 10:06:09, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2411, decode.acc_seg: 90.5619, aux.loss_ce: 0.1217, aux.acc_seg: 88.1775, loss: 0.3628, grad_norm: 3.6502
2023-02-19 06:40:24,394 - mmseg - INFO - Iter [32850/160000]	lr: 4.768e-05, eta: 10:06:01, time: 0.322, data_time: 0.048, memory: 15214, decode.loss_ce: 0.2517, decode.acc_seg: 90.0129, aux.loss_ce: 0.1256, aux.acc_seg: 87.6036, loss: 0.3773, grad_norm: 3.0687
2023-02-19 06:40:38,188 - mmseg - INFO - Iter [32900/160000]	lr: 4.766e-05, eta: 10:05:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2474, decode.acc_seg: 89.9707, aux.loss_ce: 0.1257, aux.acc_seg: 87.4124, loss: 0.3731, grad_norm: 3.1222
2023-02-19 06:40:52,122 - mmseg - INFO - Iter [32950/160000]	lr: 4.764e-05, eta: 10:05:29, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2462, decode.acc_seg: 90.2152, aux.loss_ce: 0.1224, aux.acc_seg: 87.7647, loss: 0.3687, grad_norm: 3.2189
2023-02-19 06:41:06,188 - mmseg - INFO - Saving checkpoint at 33000 iterations
2023-02-19 06:41:09,400 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:41:09,400 - mmseg - INFO - Iter [33000/160000]	lr: 4.763e-05, eta: 10:05:27, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2520, decode.acc_seg: 90.3391, aux.loss_ce: 0.1254, aux.acc_seg: 88.0273, loss: 0.3774, grad_norm: 3.7583
2023-02-19 06:41:23,944 - mmseg - INFO - Iter [33050/160000]	lr: 4.761e-05, eta: 10:05:13, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2486, decode.acc_seg: 90.2922, aux.loss_ce: 0.1259, aux.acc_seg: 87.6426, loss: 0.3745, grad_norm: 2.9499
2023-02-19 06:41:37,463 - mmseg - INFO - Iter [33100/160000]	lr: 4.759e-05, eta: 10:04:56, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2566, decode.acc_seg: 89.8882, aux.loss_ce: 0.1281, aux.acc_seg: 87.4481, loss: 0.3846, grad_norm: 3.9744
2023-02-19 06:41:51,977 - mmseg - INFO - Iter [33150/160000]	lr: 4.757e-05, eta: 10:04:42, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2418, decode.acc_seg: 90.4881, aux.loss_ce: 0.1206, aux.acc_seg: 88.1784, loss: 0.3624, grad_norm: 2.9482
2023-02-19 06:42:06,214 - mmseg - INFO - Iter [33200/160000]	lr: 4.755e-05, eta: 10:04:28, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2576, decode.acc_seg: 89.8692, aux.loss_ce: 0.1288, aux.acc_seg: 87.2003, loss: 0.3864, grad_norm: 4.4701
2023-02-19 06:42:20,654 - mmseg - INFO - Iter [33250/160000]	lr: 4.753e-05, eta: 10:04:14, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2711, decode.acc_seg: 89.3141, aux.loss_ce: 0.1345, aux.acc_seg: 86.9676, loss: 0.4055, grad_norm: 3.7341
2023-02-19 06:42:34,389 - mmseg - INFO - Iter [33300/160000]	lr: 4.751e-05, eta: 10:03:58, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2530, decode.acc_seg: 90.1570, aux.loss_ce: 0.1280, aux.acc_seg: 87.3825, loss: 0.3810, grad_norm: 2.9580
2023-02-19 06:42:48,074 - mmseg - INFO - Iter [33350/160000]	lr: 4.749e-05, eta: 10:03:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2593, decode.acc_seg: 90.1671, aux.loss_ce: 0.1319, aux.acc_seg: 87.5460, loss: 0.3912, grad_norm: 3.3800
2023-02-19 06:43:02,367 - mmseg - INFO - Iter [33400/160000]	lr: 4.748e-05, eta: 10:03:27, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2565, decode.acc_seg: 90.0054, aux.loss_ce: 0.1268, aux.acc_seg: 87.5965, loss: 0.3833, grad_norm: 3.1489
2023-02-19 06:43:16,457 - mmseg - INFO - Iter [33450/160000]	lr: 4.746e-05, eta: 10:03:12, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2587, decode.acc_seg: 89.9531, aux.loss_ce: 0.1272, aux.acc_seg: 87.6601, loss: 0.3859, grad_norm: 3.6094
2023-02-19 06:43:31,482 - mmseg - INFO - Iter [33500/160000]	lr: 4.744e-05, eta: 10:03:00, time: 0.301, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2612, decode.acc_seg: 90.0240, aux.loss_ce: 0.1278, aux.acc_seg: 87.4056, loss: 0.3891, grad_norm: 3.8435
2023-02-19 06:43:45,750 - mmseg - INFO - Iter [33550/160000]	lr: 4.742e-05, eta: 10:02:46, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2544, decode.acc_seg: 90.1356, aux.loss_ce: 0.1286, aux.acc_seg: 87.6913, loss: 0.3831, grad_norm: 3.2648
2023-02-19 06:43:59,435 - mmseg - INFO - Iter [33600/160000]	lr: 4.740e-05, eta: 10:02:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2415, decode.acc_seg: 90.5075, aux.loss_ce: 0.1240, aux.acc_seg: 87.8597, loss: 0.3654, grad_norm: 3.5757
2023-02-19 06:44:13,829 - mmseg - INFO - Iter [33650/160000]	lr: 4.738e-05, eta: 10:02:15, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2670, decode.acc_seg: 89.7913, aux.loss_ce: 0.1337, aux.acc_seg: 87.2214, loss: 0.4006, grad_norm: 3.8422
2023-02-19 06:44:27,826 - mmseg - INFO - Iter [33700/160000]	lr: 4.736e-05, eta: 10:02:00, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2613, decode.acc_seg: 89.5392, aux.loss_ce: 0.1310, aux.acc_seg: 87.1312, loss: 0.3923, grad_norm: 3.6354
2023-02-19 06:44:42,112 - mmseg - INFO - Iter [33750/160000]	lr: 4.734e-05, eta: 10:01:45, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2682, decode.acc_seg: 89.3771, aux.loss_ce: 0.1341, aux.acc_seg: 86.9012, loss: 0.4022, grad_norm: 3.7105
2023-02-19 06:44:55,962 - mmseg - INFO - Iter [33800/160000]	lr: 4.733e-05, eta: 10:01:29, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2498, decode.acc_seg: 90.2918, aux.loss_ce: 0.1266, aux.acc_seg: 87.5662, loss: 0.3764, grad_norm: 3.0575
2023-02-19 06:45:10,945 - mmseg - INFO - Iter [33850/160000]	lr: 4.731e-05, eta: 10:01:17, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2565, decode.acc_seg: 90.0516, aux.loss_ce: 0.1249, aux.acc_seg: 87.7622, loss: 0.3814, grad_norm: 3.9200
2023-02-19 06:45:24,885 - mmseg - INFO - Iter [33900/160000]	lr: 4.729e-05, eta: 10:01:02, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2380, decode.acc_seg: 90.6607, aux.loss_ce: 0.1207, aux.acc_seg: 88.2402, loss: 0.3587, grad_norm: 3.0647
2023-02-19 06:45:38,890 - mmseg - INFO - Iter [33950/160000]	lr: 4.727e-05, eta: 10:00:46, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2463, decode.acc_seg: 90.3559, aux.loss_ce: 0.1236, aux.acc_seg: 87.9583, loss: 0.3699, grad_norm: 3.2128
2023-02-19 06:45:52,826 - mmseg - INFO - Saving checkpoint at 34000 iterations
2023-02-19 06:45:56,052 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:45:56,052 - mmseg - INFO - Iter [34000/160000]	lr: 4.725e-05, eta: 10:00:43, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2460, decode.acc_seg: 90.3177, aux.loss_ce: 0.1213, aux.acc_seg: 88.2546, loss: 0.3673, grad_norm: 2.9316
2023-02-19 06:46:10,361 - mmseg - INFO - Iter [34050/160000]	lr: 4.723e-05, eta: 10:00:28, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2485, decode.acc_seg: 89.9867, aux.loss_ce: 0.1243, aux.acc_seg: 87.6177, loss: 0.3728, grad_norm: 3.2889
2023-02-19 06:46:24,569 - mmseg - INFO - Iter [34100/160000]	lr: 4.721e-05, eta: 10:00:14, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2656, decode.acc_seg: 89.6187, aux.loss_ce: 0.1358, aux.acc_seg: 86.8277, loss: 0.4014, grad_norm: 3.7965
2023-02-19 06:46:40,679 - mmseg - INFO - Iter [34150/160000]	lr: 4.719e-05, eta: 10:00:06, time: 0.322, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2315, decode.acc_seg: 90.6358, aux.loss_ce: 0.1196, aux.acc_seg: 87.9781, loss: 0.3511, grad_norm: 3.3081
2023-02-19 06:46:54,228 - mmseg - INFO - Iter [34200/160000]	lr: 4.718e-05, eta: 9:59:49, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2319, decode.acc_seg: 90.7112, aux.loss_ce: 0.1170, aux.acc_seg: 88.4838, loss: 0.3489, grad_norm: 3.3778
2023-02-19 06:47:08,726 - mmseg - INFO - Iter [34250/160000]	lr: 4.716e-05, eta: 9:59:35, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2615, decode.acc_seg: 89.8729, aux.loss_ce: 0.1298, aux.acc_seg: 87.3490, loss: 0.3913, grad_norm: 3.2230
2023-02-19 06:47:22,857 - mmseg - INFO - Iter [34300/160000]	lr: 4.714e-05, eta: 9:59:21, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2541, decode.acc_seg: 90.0571, aux.loss_ce: 0.1280, aux.acc_seg: 87.4831, loss: 0.3821, grad_norm: 3.0712
2023-02-19 06:47:36,461 - mmseg - INFO - Iter [34350/160000]	lr: 4.712e-05, eta: 9:59:04, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2616, decode.acc_seg: 89.5959, aux.loss_ce: 0.1298, aux.acc_seg: 87.1751, loss: 0.3914, grad_norm: 4.1826
2023-02-19 06:47:50,202 - mmseg - INFO - Iter [34400/160000]	lr: 4.710e-05, eta: 9:58:47, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2355, decode.acc_seg: 90.6886, aux.loss_ce: 0.1195, aux.acc_seg: 88.2865, loss: 0.3551, grad_norm: 2.8842
2023-02-19 06:48:03,837 - mmseg - INFO - Iter [34450/160000]	lr: 4.708e-05, eta: 9:58:31, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2473, decode.acc_seg: 90.1914, aux.loss_ce: 0.1283, aux.acc_seg: 87.2561, loss: 0.3756, grad_norm: 3.0384
2023-02-19 06:48:19,062 - mmseg - INFO - Iter [34500/160000]	lr: 4.706e-05, eta: 9:58:20, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2483, decode.acc_seg: 90.4048, aux.loss_ce: 0.1270, aux.acc_seg: 87.9385, loss: 0.3753, grad_norm: 3.2875
2023-02-19 06:48:33,041 - mmseg - INFO - Iter [34550/160000]	lr: 4.704e-05, eta: 9:58:04, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2551, decode.acc_seg: 90.1786, aux.loss_ce: 0.1268, aux.acc_seg: 87.8372, loss: 0.3818, grad_norm: 3.7496
2023-02-19 06:48:46,991 - mmseg - INFO - Iter [34600/160000]	lr: 4.703e-05, eta: 9:57:49, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2517, decode.acc_seg: 90.2071, aux.loss_ce: 0.1270, aux.acc_seg: 87.5724, loss: 0.3787, grad_norm: 3.3711
2023-02-19 06:49:01,016 - mmseg - INFO - Iter [34650/160000]	lr: 4.701e-05, eta: 9:57:33, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2488, decode.acc_seg: 90.3922, aux.loss_ce: 0.1255, aux.acc_seg: 87.7507, loss: 0.3743, grad_norm: 2.8203
2023-02-19 06:49:15,511 - mmseg - INFO - Iter [34700/160000]	lr: 4.699e-05, eta: 9:57:20, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2484, decode.acc_seg: 90.1001, aux.loss_ce: 0.1252, aux.acc_seg: 87.6996, loss: 0.3736, grad_norm: 3.4683
2023-02-19 06:49:29,457 - mmseg - INFO - Iter [34750/160000]	lr: 4.697e-05, eta: 9:57:04, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2430, decode.acc_seg: 90.1941, aux.loss_ce: 0.1209, aux.acc_seg: 87.7975, loss: 0.3639, grad_norm: 3.5175
2023-02-19 06:49:43,037 - mmseg - INFO - Iter [34800/160000]	lr: 4.695e-05, eta: 9:56:47, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2441, decode.acc_seg: 90.4926, aux.loss_ce: 0.1219, aux.acc_seg: 88.1763, loss: 0.3660, grad_norm: 3.4576
2023-02-19 06:49:56,886 - mmseg - INFO - Iter [34850/160000]	lr: 4.693e-05, eta: 9:56:31, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2407, decode.acc_seg: 90.2616, aux.loss_ce: 0.1210, aux.acc_seg: 87.8655, loss: 0.3617, grad_norm: 2.9835
2023-02-19 06:50:10,491 - mmseg - INFO - Iter [34900/160000]	lr: 4.691e-05, eta: 9:56:14, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2603, decode.acc_seg: 89.9596, aux.loss_ce: 0.1295, aux.acc_seg: 87.5584, loss: 0.3898, grad_norm: 3.3148
2023-02-19 06:50:24,103 - mmseg - INFO - Iter [34950/160000]	lr: 4.689e-05, eta: 9:55:58, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2468, decode.acc_seg: 90.2420, aux.loss_ce: 0.1266, aux.acc_seg: 87.6535, loss: 0.3734, grad_norm: 3.0930
2023-02-19 06:50:37,718 - mmseg - INFO - Saving checkpoint at 35000 iterations
2023-02-19 06:50:41,032 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:50:41,032 - mmseg - INFO - Iter [35000/160000]	lr: 4.688e-05, eta: 9:55:53, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2481, decode.acc_seg: 90.5045, aux.loss_ce: 0.1269, aux.acc_seg: 87.7639, loss: 0.3750, grad_norm: 3.4656
2023-02-19 06:50:54,966 - mmseg - INFO - Iter [35050/160000]	lr: 4.686e-05, eta: 9:55:37, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2549, decode.acc_seg: 90.1786, aux.loss_ce: 0.1300, aux.acc_seg: 87.4979, loss: 0.3850, grad_norm: 3.8453
2023-02-19 06:51:09,121 - mmseg - INFO - Iter [35100/160000]	lr: 4.684e-05, eta: 9:55:22, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2373, decode.acc_seg: 90.5328, aux.loss_ce: 0.1196, aux.acc_seg: 88.1473, loss: 0.3568, grad_norm: 3.0795
2023-02-19 06:51:22,717 - mmseg - INFO - Iter [35150/160000]	lr: 4.682e-05, eta: 9:55:06, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2560, decode.acc_seg: 89.8566, aux.loss_ce: 0.1253, aux.acc_seg: 87.4815, loss: 0.3813, grad_norm: 3.1482
2023-02-19 06:51:36,439 - mmseg - INFO - Iter [35200/160000]	lr: 4.680e-05, eta: 9:54:49, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2470, decode.acc_seg: 90.1190, aux.loss_ce: 0.1203, aux.acc_seg: 88.1552, loss: 0.3673, grad_norm: 3.0434
2023-02-19 06:51:50,137 - mmseg - INFO - Iter [35250/160000]	lr: 4.678e-05, eta: 9:54:33, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2432, decode.acc_seg: 90.3574, aux.loss_ce: 0.1200, aux.acc_seg: 88.1379, loss: 0.3632, grad_norm: 2.9965
2023-02-19 06:52:03,662 - mmseg - INFO - Iter [35300/160000]	lr: 4.676e-05, eta: 9:54:16, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2412, decode.acc_seg: 90.3825, aux.loss_ce: 0.1218, aux.acc_seg: 87.9932, loss: 0.3630, grad_norm: 2.9210
2023-02-19 06:52:17,704 - mmseg - INFO - Iter [35350/160000]	lr: 4.674e-05, eta: 9:54:01, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2615, decode.acc_seg: 89.6972, aux.loss_ce: 0.1288, aux.acc_seg: 86.9895, loss: 0.3903, grad_norm: 3.4002
2023-02-19 06:52:33,636 - mmseg - INFO - Iter [35400/160000]	lr: 4.673e-05, eta: 9:53:52, time: 0.318, data_time: 0.048, memory: 15214, decode.loss_ce: 0.2542, decode.acc_seg: 89.8566, aux.loss_ce: 0.1297, aux.acc_seg: 87.2154, loss: 0.3839, grad_norm: 3.3070
2023-02-19 06:52:47,232 - mmseg - INFO - Iter [35450/160000]	lr: 4.671e-05, eta: 9:53:35, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2378, decode.acc_seg: 90.5349, aux.loss_ce: 0.1236, aux.acc_seg: 87.7958, loss: 0.3614, grad_norm: 3.3607
2023-02-19 06:53:01,180 - mmseg - INFO - Iter [35500/160000]	lr: 4.669e-05, eta: 9:53:20, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2341, decode.acc_seg: 90.9298, aux.loss_ce: 0.1185, aux.acc_seg: 88.5035, loss: 0.3526, grad_norm: 2.9854
2023-02-19 06:53:15,343 - mmseg - INFO - Iter [35550/160000]	lr: 4.667e-05, eta: 9:53:05, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2391, decode.acc_seg: 90.5067, aux.loss_ce: 0.1216, aux.acc_seg: 88.1897, loss: 0.3607, grad_norm: 3.8615
2023-02-19 06:53:29,423 - mmseg - INFO - Iter [35600/160000]	lr: 4.665e-05, eta: 9:52:50, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2394, decode.acc_seg: 90.4207, aux.loss_ce: 0.1182, aux.acc_seg: 88.1665, loss: 0.3575, grad_norm: 3.9833
2023-02-19 06:53:42,975 - mmseg - INFO - Iter [35650/160000]	lr: 4.663e-05, eta: 9:52:33, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2292, decode.acc_seg: 90.8583, aux.loss_ce: 0.1176, aux.acc_seg: 88.4417, loss: 0.3468, grad_norm: 3.0182
2023-02-19 06:53:56,643 - mmseg - INFO - Iter [35700/160000]	lr: 4.661e-05, eta: 9:52:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2471, decode.acc_seg: 90.0768, aux.loss_ce: 0.1236, aux.acc_seg: 87.6144, loss: 0.3707, grad_norm: 3.4081
2023-02-19 06:54:10,561 - mmseg - INFO - Iter [35750/160000]	lr: 4.659e-05, eta: 9:52:01, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2268, decode.acc_seg: 91.1576, aux.loss_ce: 0.1166, aux.acc_seg: 88.6148, loss: 0.3435, grad_norm: 3.1133
2023-02-19 06:54:24,134 - mmseg - INFO - Iter [35800/160000]	lr: 4.658e-05, eta: 9:51:44, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2320, decode.acc_seg: 90.8414, aux.loss_ce: 0.1213, aux.acc_seg: 88.1213, loss: 0.3533, grad_norm: 3.2160
2023-02-19 06:54:38,333 - mmseg - INFO - Iter [35850/160000]	lr: 4.656e-05, eta: 9:51:30, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2486, decode.acc_seg: 90.2237, aux.loss_ce: 0.1252, aux.acc_seg: 87.7816, loss: 0.3738, grad_norm: 3.2829
2023-02-19 06:54:52,046 - mmseg - INFO - Iter [35900/160000]	lr: 4.654e-05, eta: 9:51:13, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2388, decode.acc_seg: 90.8252, aux.loss_ce: 0.1187, aux.acc_seg: 88.5625, loss: 0.3575, grad_norm: 3.9041
2023-02-19 06:55:06,051 - mmseg - INFO - Iter [35950/160000]	lr: 4.652e-05, eta: 9:50:58, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2539, decode.acc_seg: 89.8945, aux.loss_ce: 0.1284, aux.acc_seg: 87.2533, loss: 0.3823, grad_norm: 3.7726
2023-02-19 06:55:20,258 - mmseg - INFO - Saving checkpoint at 36000 iterations
2023-02-19 06:55:23,474 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 06:55:23,474 - mmseg - INFO - Iter [36000/160000]	lr: 4.650e-05, eta: 9:50:54, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2474, decode.acc_seg: 90.5408, aux.loss_ce: 0.1201, aux.acc_seg: 88.3005, loss: 0.3675, grad_norm: 3.2055
2023-02-19 06:55:37,979 - mmseg - INFO - Iter [36050/160000]	lr: 4.648e-05, eta: 9:50:41, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2409, decode.acc_seg: 90.4099, aux.loss_ce: 0.1213, aux.acc_seg: 87.9702, loss: 0.3622, grad_norm: 3.5057
2023-02-19 06:55:52,447 - mmseg - INFO - Iter [36100/160000]	lr: 4.646e-05, eta: 9:50:27, time: 0.289, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2400, decode.acc_seg: 90.5048, aux.loss_ce: 0.1219, aux.acc_seg: 87.9772, loss: 0.3618, grad_norm: 3.1661
2023-02-19 06:56:07,098 - mmseg - INFO - Iter [36150/160000]	lr: 4.644e-05, eta: 9:50:14, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2330, decode.acc_seg: 90.6961, aux.loss_ce: 0.1174, aux.acc_seg: 88.4000, loss: 0.3504, grad_norm: 3.9434
2023-02-19 06:56:20,811 - mmseg - INFO - Iter [36200/160000]	lr: 4.643e-05, eta: 9:49:58, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2407, decode.acc_seg: 90.4012, aux.loss_ce: 0.1243, aux.acc_seg: 87.7094, loss: 0.3650, grad_norm: 3.3749
2023-02-19 06:56:35,243 - mmseg - INFO - Iter [36250/160000]	lr: 4.641e-05, eta: 9:49:44, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2462, decode.acc_seg: 90.3379, aux.loss_ce: 0.1235, aux.acc_seg: 87.8480, loss: 0.3696, grad_norm: 3.3755
2023-02-19 06:56:49,200 - mmseg - INFO - Iter [36300/160000]	lr: 4.639e-05, eta: 9:49:28, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2406, decode.acc_seg: 90.4812, aux.loss_ce: 0.1212, aux.acc_seg: 88.0993, loss: 0.3619, grad_norm: 3.0971
2023-02-19 06:57:03,359 - mmseg - INFO - Iter [36350/160000]	lr: 4.637e-05, eta: 9:49:14, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2270, decode.acc_seg: 91.0341, aux.loss_ce: 0.1183, aux.acc_seg: 88.4129, loss: 0.3453, grad_norm: 2.9249
2023-02-19 06:57:17,145 - mmseg - INFO - Iter [36400/160000]	lr: 4.635e-05, eta: 9:48:58, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2362, decode.acc_seg: 90.4646, aux.loss_ce: 0.1235, aux.acc_seg: 87.5540, loss: 0.3597, grad_norm: 3.3319
2023-02-19 06:57:30,716 - mmseg - INFO - Iter [36450/160000]	lr: 4.633e-05, eta: 9:48:41, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2387, decode.acc_seg: 90.6881, aux.loss_ce: 0.1208, aux.acc_seg: 88.1141, loss: 0.3595, grad_norm: 2.9467
2023-02-19 06:57:44,565 - mmseg - INFO - Iter [36500/160000]	lr: 4.631e-05, eta: 9:48:25, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2356, decode.acc_seg: 90.5658, aux.loss_ce: 0.1193, aux.acc_seg: 88.0407, loss: 0.3548, grad_norm: 2.9143
2023-02-19 06:57:58,493 - mmseg - INFO - Iter [36550/160000]	lr: 4.629e-05, eta: 9:48:10, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2410, decode.acc_seg: 90.6177, aux.loss_ce: 0.1242, aux.acc_seg: 87.8489, loss: 0.3652, grad_norm: 3.3194
2023-02-19 06:58:12,303 - mmseg - INFO - Iter [36600/160000]	lr: 4.628e-05, eta: 9:47:54, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2549, decode.acc_seg: 89.9413, aux.loss_ce: 0.1292, aux.acc_seg: 87.3307, loss: 0.3841, grad_norm: 3.4480
2023-02-19 06:58:28,054 - mmseg - INFO - Iter [36650/160000]	lr: 4.626e-05, eta: 9:47:44, time: 0.315, data_time: 0.046, memory: 15214, decode.loss_ce: 0.2339, decode.acc_seg: 90.8689, aux.loss_ce: 0.1171, aux.acc_seg: 88.2211, loss: 0.3509, grad_norm: 3.5564
2023-02-19 06:58:41,932 - mmseg - INFO - Iter [36700/160000]	lr: 4.624e-05, eta: 9:47:29, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2371, decode.acc_seg: 90.6990, aux.loss_ce: 0.1183, aux.acc_seg: 88.4775, loss: 0.3554, grad_norm: 3.3332
2023-02-19 06:58:56,239 - mmseg - INFO - Iter [36750/160000]	lr: 4.622e-05, eta: 9:47:14, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2392, decode.acc_seg: 90.5652, aux.loss_ce: 0.1217, aux.acc_seg: 88.2050, loss: 0.3609, grad_norm: 3.3127
2023-02-19 06:59:09,891 - mmseg - INFO - Iter [36800/160000]	lr: 4.620e-05, eta: 9:46:58, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2292, decode.acc_seg: 90.7429, aux.loss_ce: 0.1171, aux.acc_seg: 88.2484, loss: 0.3463, grad_norm: 3.3944
2023-02-19 06:59:23,632 - mmseg - INFO - Iter [36850/160000]	lr: 4.618e-05, eta: 9:46:42, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2298, decode.acc_seg: 91.0167, aux.loss_ce: 0.1180, aux.acc_seg: 88.4151, loss: 0.3478, grad_norm: 3.0884
2023-02-19 06:59:37,343 - mmseg - INFO - Iter [36900/160000]	lr: 4.616e-05, eta: 9:46:26, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2266, decode.acc_seg: 90.9327, aux.loss_ce: 0.1154, aux.acc_seg: 88.7851, loss: 0.3420, grad_norm: 3.0643
2023-02-19 06:59:51,678 - mmseg - INFO - Iter [36950/160000]	lr: 4.614e-05, eta: 9:46:11, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2320, decode.acc_seg: 90.7197, aux.loss_ce: 0.1160, aux.acc_seg: 88.5097, loss: 0.3479, grad_norm: 3.3002
2023-02-19 07:00:05,702 - mmseg - INFO - Saving checkpoint at 37000 iterations
2023-02-19 07:00:08,909 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:00:08,909 - mmseg - INFO - Iter [37000/160000]	lr: 4.613e-05, eta: 9:46:07, time: 0.345, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2401, decode.acc_seg: 90.5348, aux.loss_ce: 0.1259, aux.acc_seg: 87.6997, loss: 0.3660, grad_norm: 3.5575
2023-02-19 07:00:23,988 - mmseg - INFO - Iter [37050/160000]	lr: 4.611e-05, eta: 9:45:55, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2420, decode.acc_seg: 90.7172, aux.loss_ce: 0.1247, aux.acc_seg: 87.9813, loss: 0.3668, grad_norm: 3.9867
2023-02-19 07:00:37,527 - mmseg - INFO - Iter [37100/160000]	lr: 4.609e-05, eta: 9:45:38, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2315, decode.acc_seg: 90.7756, aux.loss_ce: 0.1182, aux.acc_seg: 88.2524, loss: 0.3497, grad_norm: 3.2129
2023-02-19 07:00:51,140 - mmseg - INFO - Iter [37150/160000]	lr: 4.607e-05, eta: 9:45:22, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2336, decode.acc_seg: 90.5924, aux.loss_ce: 0.1219, aux.acc_seg: 87.6843, loss: 0.3555, grad_norm: 3.3982
2023-02-19 07:01:05,831 - mmseg - INFO - Iter [37200/160000]	lr: 4.605e-05, eta: 9:45:09, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2185, decode.acc_seg: 91.2640, aux.loss_ce: 0.1128, aux.acc_seg: 88.9172, loss: 0.3312, grad_norm: 2.7190
2023-02-19 07:01:20,367 - mmseg - INFO - Iter [37250/160000]	lr: 4.603e-05, eta: 9:44:55, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2525, decode.acc_seg: 90.2540, aux.loss_ce: 0.1275, aux.acc_seg: 87.6507, loss: 0.3800, grad_norm: 4.3430
2023-02-19 07:01:34,700 - mmseg - INFO - Iter [37300/160000]	lr: 4.601e-05, eta: 9:44:41, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2383, decode.acc_seg: 90.6299, aux.loss_ce: 0.1236, aux.acc_seg: 87.9350, loss: 0.3619, grad_norm: 3.0534
2023-02-19 07:01:48,564 - mmseg - INFO - Iter [37350/160000]	lr: 4.599e-05, eta: 9:44:25, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2294, decode.acc_seg: 90.7121, aux.loss_ce: 0.1192, aux.acc_seg: 87.9569, loss: 0.3486, grad_norm: 3.1310
2023-02-19 07:02:02,931 - mmseg - INFO - Iter [37400/160000]	lr: 4.598e-05, eta: 9:44:11, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2293, decode.acc_seg: 91.0213, aux.loss_ce: 0.1210, aux.acc_seg: 88.1454, loss: 0.3503, grad_norm: 3.5512
2023-02-19 07:02:16,829 - mmseg - INFO - Iter [37450/160000]	lr: 4.596e-05, eta: 9:43:56, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2292, decode.acc_seg: 90.8848, aux.loss_ce: 0.1158, aux.acc_seg: 88.5274, loss: 0.3450, grad_norm: 2.9808
2023-02-19 07:02:30,346 - mmseg - INFO - Iter [37500/160000]	lr: 4.594e-05, eta: 9:43:39, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2447, decode.acc_seg: 90.3323, aux.loss_ce: 0.1281, aux.acc_seg: 87.1938, loss: 0.3728, grad_norm: 3.3827
2023-02-19 07:02:44,484 - mmseg - INFO - Iter [37550/160000]	lr: 4.592e-05, eta: 9:43:24, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2215, decode.acc_seg: 91.2576, aux.loss_ce: 0.1141, aux.acc_seg: 88.6694, loss: 0.3357, grad_norm: 2.7799
2023-02-19 07:02:58,594 - mmseg - INFO - Iter [37600/160000]	lr: 4.590e-05, eta: 9:43:09, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2330, decode.acc_seg: 91.0883, aux.loss_ce: 0.1200, aux.acc_seg: 88.3318, loss: 0.3530, grad_norm: 3.1253
2023-02-19 07:03:12,665 - mmseg - INFO - Iter [37650/160000]	lr: 4.588e-05, eta: 9:42:54, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2226, decode.acc_seg: 91.0433, aux.loss_ce: 0.1138, aux.acc_seg: 88.5747, loss: 0.3365, grad_norm: 4.1599
2023-02-19 07:03:26,666 - mmseg - INFO - Iter [37700/160000]	lr: 4.586e-05, eta: 9:42:39, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2386, decode.acc_seg: 90.6484, aux.loss_ce: 0.1224, aux.acc_seg: 87.9634, loss: 0.3609, grad_norm: 3.6631
2023-02-19 07:03:40,812 - mmseg - INFO - Iter [37750/160000]	lr: 4.584e-05, eta: 9:42:24, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2505, decode.acc_seg: 89.9181, aux.loss_ce: 0.1270, aux.acc_seg: 87.4284, loss: 0.3776, grad_norm: 3.2700
2023-02-19 07:03:54,440 - mmseg - INFO - Iter [37800/160000]	lr: 4.583e-05, eta: 9:42:08, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2376, decode.acc_seg: 90.2669, aux.loss_ce: 0.1219, aux.acc_seg: 88.0082, loss: 0.3595, grad_norm: 4.2523
2023-02-19 07:04:09,183 - mmseg - INFO - Iter [37850/160000]	lr: 4.581e-05, eta: 9:41:55, time: 0.295, data_time: 0.006, memory: 15214, decode.loss_ce: 0.2317, decode.acc_seg: 90.7819, aux.loss_ce: 0.1202, aux.acc_seg: 88.2119, loss: 0.3519, grad_norm: 2.9188
2023-02-19 07:04:25,002 - mmseg - INFO - Iter [37900/160000]	lr: 4.579e-05, eta: 9:41:46, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2288, decode.acc_seg: 90.8000, aux.loss_ce: 0.1170, aux.acc_seg: 88.4048, loss: 0.3458, grad_norm: 3.3689
2023-02-19 07:04:38,635 - mmseg - INFO - Iter [37950/160000]	lr: 4.577e-05, eta: 9:41:29, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2314, decode.acc_seg: 90.9302, aux.loss_ce: 0.1185, aux.acc_seg: 88.4366, loss: 0.3499, grad_norm: 5.3554
2023-02-19 07:04:52,759 - mmseg - INFO - Saving checkpoint at 38000 iterations
2023-02-19 07:04:56,010 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:04:56,011 - mmseg - INFO - Iter [38000/160000]	lr: 4.575e-05, eta: 9:41:25, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2389, decode.acc_seg: 90.6790, aux.loss_ce: 0.1281, aux.acc_seg: 87.4964, loss: 0.3670, grad_norm: 4.0199
2023-02-19 07:05:09,825 - mmseg - INFO - Iter [38050/160000]	lr: 4.573e-05, eta: 9:41:09, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2308, decode.acc_seg: 90.4884, aux.loss_ce: 0.1168, aux.acc_seg: 88.2031, loss: 0.3475, grad_norm: 4.2340
2023-02-19 07:05:24,264 - mmseg - INFO - Iter [38100/160000]	lr: 4.571e-05, eta: 9:40:55, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2298, decode.acc_seg: 90.8545, aux.loss_ce: 0.1179, aux.acc_seg: 88.2955, loss: 0.3477, grad_norm: 3.7757
2023-02-19 07:05:38,262 - mmseg - INFO - Iter [38150/160000]	lr: 4.569e-05, eta: 9:40:40, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2161, decode.acc_seg: 91.4528, aux.loss_ce: 0.1106, aux.acc_seg: 89.0311, loss: 0.3266, grad_norm: 2.6042
2023-02-19 07:05:52,024 - mmseg - INFO - Iter [38200/160000]	lr: 4.568e-05, eta: 9:40:24, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2325, decode.acc_seg: 90.6450, aux.loss_ce: 0.1207, aux.acc_seg: 88.1872, loss: 0.3532, grad_norm: 3.2006
2023-02-19 07:06:05,892 - mmseg - INFO - Iter [38250/160000]	lr: 4.566e-05, eta: 9:40:08, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2373, decode.acc_seg: 90.5441, aux.loss_ce: 0.1244, aux.acc_seg: 87.6519, loss: 0.3617, grad_norm: 4.5570
2023-02-19 07:06:19,433 - mmseg - INFO - Iter [38300/160000]	lr: 4.564e-05, eta: 9:39:51, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2302, decode.acc_seg: 90.6939, aux.loss_ce: 0.1178, aux.acc_seg: 87.9927, loss: 0.3480, grad_norm: 3.1985
2023-02-19 07:06:33,131 - mmseg - INFO - Iter [38350/160000]	lr: 4.562e-05, eta: 9:39:35, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2216, decode.acc_seg: 91.2021, aux.loss_ce: 0.1151, aux.acc_seg: 88.7618, loss: 0.3368, grad_norm: 2.8680
2023-02-19 07:06:46,699 - mmseg - INFO - Iter [38400/160000]	lr: 4.560e-05, eta: 9:39:19, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2219, decode.acc_seg: 91.2086, aux.loss_ce: 0.1146, aux.acc_seg: 88.7566, loss: 0.3365, grad_norm: 2.9018
2023-02-19 07:07:00,629 - mmseg - INFO - Iter [38450/160000]	lr: 4.558e-05, eta: 9:39:03, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2411, decode.acc_seg: 90.6170, aux.loss_ce: 0.1216, aux.acc_seg: 88.0038, loss: 0.3627, grad_norm: 3.1408
2023-02-19 07:07:14,432 - mmseg - INFO - Iter [38500/160000]	lr: 4.556e-05, eta: 9:38:47, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2224, decode.acc_seg: 90.9000, aux.loss_ce: 0.1112, aux.acc_seg: 88.8502, loss: 0.3335, grad_norm: 2.9962
2023-02-19 07:07:28,362 - mmseg - INFO - Iter [38550/160000]	lr: 4.554e-05, eta: 9:38:32, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2290, decode.acc_seg: 91.0213, aux.loss_ce: 0.1206, aux.acc_seg: 88.2133, loss: 0.3496, grad_norm: 2.7113
2023-02-19 07:07:41,975 - mmseg - INFO - Iter [38600/160000]	lr: 4.553e-05, eta: 9:38:16, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2324, decode.acc_seg: 90.9557, aux.loss_ce: 0.1195, aux.acc_seg: 88.5134, loss: 0.3519, grad_norm: 3.4151
2023-02-19 07:07:55,770 - mmseg - INFO - Iter [38650/160000]	lr: 4.551e-05, eta: 9:38:00, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2351, decode.acc_seg: 90.8448, aux.loss_ce: 0.1183, aux.acc_seg: 88.4552, loss: 0.3534, grad_norm: 3.1096
2023-02-19 07:08:09,571 - mmseg - INFO - Iter [38700/160000]	lr: 4.549e-05, eta: 9:37:44, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2292, decode.acc_seg: 91.0964, aux.loss_ce: 0.1163, aux.acc_seg: 88.5286, loss: 0.3455, grad_norm: 3.3669
2023-02-19 07:08:23,744 - mmseg - INFO - Iter [38750/160000]	lr: 4.547e-05, eta: 9:37:29, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2371, decode.acc_seg: 90.4790, aux.loss_ce: 0.1226, aux.acc_seg: 87.8433, loss: 0.3597, grad_norm: 3.0937
2023-02-19 07:08:37,473 - mmseg - INFO - Iter [38800/160000]	lr: 4.545e-05, eta: 9:37:13, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2304, decode.acc_seg: 90.7843, aux.loss_ce: 0.1153, aux.acc_seg: 88.3808, loss: 0.3457, grad_norm: 2.7180
2023-02-19 07:08:51,144 - mmseg - INFO - Iter [38850/160000]	lr: 4.543e-05, eta: 9:36:57, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2298, decode.acc_seg: 90.7481, aux.loss_ce: 0.1183, aux.acc_seg: 88.1831, loss: 0.3481, grad_norm: 3.2287
2023-02-19 07:09:04,907 - mmseg - INFO - Iter [38900/160000]	lr: 4.541e-05, eta: 9:36:41, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2417, decode.acc_seg: 90.4494, aux.loss_ce: 0.1211, aux.acc_seg: 87.8279, loss: 0.3628, grad_norm: 3.2195
2023-02-19 07:09:18,825 - mmseg - INFO - Iter [38950/160000]	lr: 4.539e-05, eta: 9:36:26, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2374, decode.acc_seg: 90.6635, aux.loss_ce: 0.1214, aux.acc_seg: 88.1084, loss: 0.3588, grad_norm: 2.7600
2023-02-19 07:09:32,924 - mmseg - INFO - Saving checkpoint at 39000 iterations
2023-02-19 07:09:36,128 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:09:36,128 - mmseg - INFO - Iter [39000/160000]	lr: 4.538e-05, eta: 9:36:21, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2333, decode.acc_seg: 90.8961, aux.loss_ce: 0.1199, aux.acc_seg: 88.4085, loss: 0.3532, grad_norm: 2.9195
2023-02-19 07:09:49,687 - mmseg - INFO - Iter [39050/160000]	lr: 4.536e-05, eta: 9:36:04, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2355, decode.acc_seg: 90.5171, aux.loss_ce: 0.1196, aux.acc_seg: 87.9411, loss: 0.3552, grad_norm: 3.0839
2023-02-19 07:10:03,911 - mmseg - INFO - Iter [39100/160000]	lr: 4.534e-05, eta: 9:35:50, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2153, decode.acc_seg: 91.3799, aux.loss_ce: 0.1134, aux.acc_seg: 88.8529, loss: 0.3288, grad_norm: 2.5757
2023-02-19 07:10:17,525 - mmseg - INFO - Iter [39150/160000]	lr: 4.532e-05, eta: 9:35:33, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2331, decode.acc_seg: 91.0114, aux.loss_ce: 0.1163, aux.acc_seg: 88.8999, loss: 0.3494, grad_norm: 3.1258
2023-02-19 07:10:33,402 - mmseg - INFO - Iter [39200/160000]	lr: 4.530e-05, eta: 9:35:24, time: 0.318, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2223, decode.acc_seg: 91.1031, aux.loss_ce: 0.1134, aux.acc_seg: 88.7068, loss: 0.3357, grad_norm: 2.3898
2023-02-19 07:10:47,627 - mmseg - INFO - Iter [39250/160000]	lr: 4.528e-05, eta: 9:35:09, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2201, decode.acc_seg: 91.1080, aux.loss_ce: 0.1130, aux.acc_seg: 88.6643, loss: 0.3331, grad_norm: 2.9410
2023-02-19 07:11:01,349 - mmseg - INFO - Iter [39300/160000]	lr: 4.526e-05, eta: 9:34:53, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2333, decode.acc_seg: 90.8520, aux.loss_ce: 0.1187, aux.acc_seg: 88.5543, loss: 0.3521, grad_norm: 3.1137
2023-02-19 07:11:15,444 - mmseg - INFO - Iter [39350/160000]	lr: 4.524e-05, eta: 9:34:38, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2269, decode.acc_seg: 91.0058, aux.loss_ce: 0.1212, aux.acc_seg: 88.1311, loss: 0.3481, grad_norm: 3.2652
2023-02-19 07:11:29,362 - mmseg - INFO - Iter [39400/160000]	lr: 4.523e-05, eta: 9:34:23, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2079, decode.acc_seg: 91.4989, aux.loss_ce: 0.1084, aux.acc_seg: 88.9583, loss: 0.3163, grad_norm: 2.8631
2023-02-19 07:11:43,291 - mmseg - INFO - Iter [39450/160000]	lr: 4.521e-05, eta: 9:34:08, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2296, decode.acc_seg: 90.9449, aux.loss_ce: 0.1195, aux.acc_seg: 88.3805, loss: 0.3491, grad_norm: 2.9336
2023-02-19 07:11:56,886 - mmseg - INFO - Iter [39500/160000]	lr: 4.519e-05, eta: 9:33:51, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2332, decode.acc_seg: 90.9173, aux.loss_ce: 0.1225, aux.acc_seg: 88.1637, loss: 0.3557, grad_norm: 3.1468
2023-02-19 07:12:10,444 - mmseg - INFO - Iter [39550/160000]	lr: 4.517e-05, eta: 9:33:35, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2262, decode.acc_seg: 90.8033, aux.loss_ce: 0.1144, aux.acc_seg: 88.3880, loss: 0.3407, grad_norm: 3.2271
2023-02-19 07:12:24,527 - mmseg - INFO - Iter [39600/160000]	lr: 4.515e-05, eta: 9:33:20, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2309, decode.acc_seg: 91.0916, aux.loss_ce: 0.1210, aux.acc_seg: 88.5696, loss: 0.3518, grad_norm: 3.3669
2023-02-19 07:12:38,104 - mmseg - INFO - Iter [39650/160000]	lr: 4.513e-05, eta: 9:33:03, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2333, decode.acc_seg: 90.8014, aux.loss_ce: 0.1199, aux.acc_seg: 88.2816, loss: 0.3532, grad_norm: 3.5298
2023-02-19 07:12:52,746 - mmseg - INFO - Iter [39700/160000]	lr: 4.511e-05, eta: 9:32:50, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2227, decode.acc_seg: 91.1985, aux.loss_ce: 0.1134, aux.acc_seg: 88.6343, loss: 0.3361, grad_norm: 3.0154
2023-02-19 07:13:07,752 - mmseg - INFO - Iter [39750/160000]	lr: 4.509e-05, eta: 9:32:38, time: 0.301, data_time: 0.006, memory: 15214, decode.loss_ce: 0.2092, decode.acc_seg: 91.4898, aux.loss_ce: 0.1089, aux.acc_seg: 88.8865, loss: 0.3181, grad_norm: 2.3843
2023-02-19 07:13:22,225 - mmseg - INFO - Iter [39800/160000]	lr: 4.508e-05, eta: 9:32:24, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2239, decode.acc_seg: 91.0455, aux.loss_ce: 0.1120, aux.acc_seg: 88.6895, loss: 0.3359, grad_norm: 2.9098
2023-02-19 07:13:36,280 - mmseg - INFO - Iter [39850/160000]	lr: 4.506e-05, eta: 9:32:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2286, decode.acc_seg: 91.2383, aux.loss_ce: 0.1192, aux.acc_seg: 88.5720, loss: 0.3478, grad_norm: 3.3484
2023-02-19 07:13:49,864 - mmseg - INFO - Iter [39900/160000]	lr: 4.504e-05, eta: 9:31:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2250, decode.acc_seg: 90.9452, aux.loss_ce: 0.1151, aux.acc_seg: 88.4493, loss: 0.3401, grad_norm: 3.4868
2023-02-19 07:14:03,951 - mmseg - INFO - Iter [39950/160000]	lr: 4.502e-05, eta: 9:31:38, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2326, decode.acc_seg: 90.7639, aux.loss_ce: 0.1185, aux.acc_seg: 88.1822, loss: 0.3511, grad_norm: 3.5226
2023-02-19 07:14:18,279 - mmseg - INFO - Saving checkpoint at 40000 iterations
2023-02-19 07:14:21,576 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:14:21,576 - mmseg - INFO - Iter [40000/160000]	lr: 4.500e-05, eta: 9:31:34, time: 0.353, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2126, decode.acc_seg: 91.4395, aux.loss_ce: 0.1145, aux.acc_seg: 88.5343, loss: 0.3271, grad_norm: 3.1780
2023-02-19 07:14:35,294 - mmseg - INFO - Iter [40050/160000]	lr: 4.498e-05, eta: 9:31:18, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2353, decode.acc_seg: 90.6293, aux.loss_ce: 0.1221, aux.acc_seg: 88.0023, loss: 0.3574, grad_norm: 3.3673
2023-02-19 07:14:48,893 - mmseg - INFO - Iter [40100/160000]	lr: 4.496e-05, eta: 9:31:02, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2133, decode.acc_seg: 91.6921, aux.loss_ce: 0.1137, aux.acc_seg: 89.0494, loss: 0.3270, grad_norm: 3.7431
2023-02-19 07:15:02,847 - mmseg - INFO - Iter [40150/160000]	lr: 4.494e-05, eta: 9:30:46, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2174, decode.acc_seg: 91.2619, aux.loss_ce: 0.1193, aux.acc_seg: 88.0327, loss: 0.3367, grad_norm: 2.6141
2023-02-19 07:15:16,682 - mmseg - INFO - Iter [40200/160000]	lr: 4.493e-05, eta: 9:30:31, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2336, decode.acc_seg: 90.7018, aux.loss_ce: 0.1243, aux.acc_seg: 87.7110, loss: 0.3578, grad_norm: 3.4257
2023-02-19 07:15:30,831 - mmseg - INFO - Iter [40250/160000]	lr: 4.491e-05, eta: 9:30:16, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2381, decode.acc_seg: 90.5856, aux.loss_ce: 0.1199, aux.acc_seg: 88.0796, loss: 0.3581, grad_norm: 3.5299
2023-02-19 07:15:44,787 - mmseg - INFO - Iter [40300/160000]	lr: 4.489e-05, eta: 9:30:01, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2336, decode.acc_seg: 90.8135, aux.loss_ce: 0.1200, aux.acc_seg: 88.3497, loss: 0.3536, grad_norm: 3.4006
2023-02-19 07:15:58,817 - mmseg - INFO - Iter [40350/160000]	lr: 4.487e-05, eta: 9:29:45, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2272, decode.acc_seg: 90.5559, aux.loss_ce: 0.1127, aux.acc_seg: 88.4570, loss: 0.3399, grad_norm: 3.3507
2023-02-19 07:16:12,646 - mmseg - INFO - Iter [40400/160000]	lr: 4.485e-05, eta: 9:29:30, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2076, decode.acc_seg: 91.6246, aux.loss_ce: 0.1084, aux.acc_seg: 89.2251, loss: 0.3160, grad_norm: 2.7559
2023-02-19 07:16:28,910 - mmseg - INFO - Iter [40450/160000]	lr: 4.483e-05, eta: 9:29:22, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.2304, decode.acc_seg: 90.7869, aux.loss_ce: 0.1173, aux.acc_seg: 88.3471, loss: 0.3477, grad_norm: 3.2219
2023-02-19 07:16:43,063 - mmseg - INFO - Iter [40500/160000]	lr: 4.481e-05, eta: 9:29:07, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2173, decode.acc_seg: 91.3738, aux.loss_ce: 0.1122, aux.acc_seg: 88.9953, loss: 0.3295, grad_norm: 2.8421
2023-02-19 07:16:56,920 - mmseg - INFO - Iter [40550/160000]	lr: 4.479e-05, eta: 9:28:51, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2070, decode.acc_seg: 91.4643, aux.loss_ce: 0.1064, aux.acc_seg: 89.1124, loss: 0.3133, grad_norm: 2.6714
2023-02-19 07:17:10,646 - mmseg - INFO - Iter [40600/160000]	lr: 4.478e-05, eta: 9:28:35, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2161, decode.acc_seg: 91.4406, aux.loss_ce: 0.1140, aux.acc_seg: 88.6380, loss: 0.3301, grad_norm: 2.6327
2023-02-19 07:17:24,672 - mmseg - INFO - Iter [40650/160000]	lr: 4.476e-05, eta: 9:28:20, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2199, decode.acc_seg: 91.3130, aux.loss_ce: 0.1136, aux.acc_seg: 88.5380, loss: 0.3335, grad_norm: 2.8791
2023-02-19 07:17:38,256 - mmseg - INFO - Iter [40700/160000]	lr: 4.474e-05, eta: 9:28:04, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2080, decode.acc_seg: 91.5090, aux.loss_ce: 0.1076, aux.acc_seg: 89.1545, loss: 0.3156, grad_norm: 2.8737
2023-02-19 07:17:51,864 - mmseg - INFO - Iter [40750/160000]	lr: 4.472e-05, eta: 9:27:48, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2154, decode.acc_seg: 91.6133, aux.loss_ce: 0.1125, aux.acc_seg: 88.9630, loss: 0.3279, grad_norm: 3.1980
2023-02-19 07:18:05,807 - mmseg - INFO - Iter [40800/160000]	lr: 4.470e-05, eta: 9:27:32, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2205, decode.acc_seg: 91.2940, aux.loss_ce: 0.1149, aux.acc_seg: 88.6118, loss: 0.3354, grad_norm: 2.9550
2023-02-19 07:18:19,931 - mmseg - INFO - Iter [40850/160000]	lr: 4.468e-05, eta: 9:27:18, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2134, decode.acc_seg: 91.4283, aux.loss_ce: 0.1134, aux.acc_seg: 88.8288, loss: 0.3268, grad_norm: 2.9762
2023-02-19 07:18:33,974 - mmseg - INFO - Iter [40900/160000]	lr: 4.466e-05, eta: 9:27:03, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2175, decode.acc_seg: 91.3066, aux.loss_ce: 0.1097, aux.acc_seg: 88.9289, loss: 0.3272, grad_norm: 2.5184
2023-02-19 07:18:47,697 - mmseg - INFO - Iter [40950/160000]	lr: 4.464e-05, eta: 9:26:47, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2309, decode.acc_seg: 90.6314, aux.loss_ce: 0.1200, aux.acc_seg: 88.0179, loss: 0.3509, grad_norm: 4.1239
2023-02-19 07:19:01,368 - mmseg - INFO - Saving checkpoint at 41000 iterations
2023-02-19 07:19:04,773 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:19:04,773 - mmseg - INFO - Iter [41000/160000]	lr: 4.463e-05, eta: 9:26:41, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2168, decode.acc_seg: 91.5003, aux.loss_ce: 0.1145, aux.acc_seg: 88.8624, loss: 0.3313, grad_norm: 2.9586
2023-02-19 07:19:18,456 - mmseg - INFO - Iter [41050/160000]	lr: 4.461e-05, eta: 9:26:25, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2317, decode.acc_seg: 90.7907, aux.loss_ce: 0.1201, aux.acc_seg: 88.2213, loss: 0.3518, grad_norm: 2.8217
2023-02-19 07:19:33,371 - mmseg - INFO - Iter [41100/160000]	lr: 4.459e-05, eta: 9:26:12, time: 0.298, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2087, decode.acc_seg: 91.6740, aux.loss_ce: 0.1082, aux.acc_seg: 89.1610, loss: 0.3169, grad_norm: 2.6193
2023-02-19 07:19:47,668 - mmseg - INFO - Iter [41150/160000]	lr: 4.457e-05, eta: 9:25:58, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2285, decode.acc_seg: 90.9060, aux.loss_ce: 0.1182, aux.acc_seg: 88.3196, loss: 0.3467, grad_norm: 3.0213
2023-02-19 07:20:01,618 - mmseg - INFO - Iter [41200/160000]	lr: 4.455e-05, eta: 9:25:43, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2213, decode.acc_seg: 91.3138, aux.loss_ce: 0.1159, aux.acc_seg: 88.5626, loss: 0.3372, grad_norm: 2.8874
2023-02-19 07:20:15,832 - mmseg - INFO - Iter [41250/160000]	lr: 4.453e-05, eta: 9:25:28, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2284, decode.acc_seg: 91.1364, aux.loss_ce: 0.1192, aux.acc_seg: 88.3349, loss: 0.3476, grad_norm: 2.9200
2023-02-19 07:20:30,465 - mmseg - INFO - Iter [41300/160000]	lr: 4.451e-05, eta: 9:25:15, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2255, decode.acc_seg: 91.1338, aux.loss_ce: 0.1132, aux.acc_seg: 88.7997, loss: 0.3386, grad_norm: 2.8436
2023-02-19 07:20:44,212 - mmseg - INFO - Iter [41350/160000]	lr: 4.449e-05, eta: 9:24:59, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2132, decode.acc_seg: 91.5072, aux.loss_ce: 0.1126, aux.acc_seg: 88.9146, loss: 0.3257, grad_norm: 2.4777
2023-02-19 07:20:57,985 - mmseg - INFO - Iter [41400/160000]	lr: 4.448e-05, eta: 9:24:43, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2196, decode.acc_seg: 91.2613, aux.loss_ce: 0.1168, aux.acc_seg: 88.4388, loss: 0.3364, grad_norm: 3.0019
2023-02-19 07:21:12,491 - mmseg - INFO - Iter [41450/160000]	lr: 4.446e-05, eta: 9:24:30, time: 0.291, data_time: 0.006, memory: 15214, decode.loss_ce: 0.2142, decode.acc_seg: 91.1385, aux.loss_ce: 0.1136, aux.acc_seg: 88.4270, loss: 0.3278, grad_norm: 2.9875
2023-02-19 07:21:26,745 - mmseg - INFO - Iter [41500/160000]	lr: 4.444e-05, eta: 9:24:15, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2193, decode.acc_seg: 91.1060, aux.loss_ce: 0.1134, aux.acc_seg: 88.6180, loss: 0.3327, grad_norm: 2.7064
2023-02-19 07:21:40,393 - mmseg - INFO - Iter [41550/160000]	lr: 4.442e-05, eta: 9:23:59, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2141, decode.acc_seg: 91.4871, aux.loss_ce: 0.1130, aux.acc_seg: 88.6929, loss: 0.3271, grad_norm: 3.1354
2023-02-19 07:21:55,336 - mmseg - INFO - Iter [41600/160000]	lr: 4.440e-05, eta: 9:23:47, time: 0.299, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2206, decode.acc_seg: 91.2652, aux.loss_ce: 0.1153, aux.acc_seg: 88.3305, loss: 0.3360, grad_norm: 3.2438
2023-02-19 07:22:09,577 - mmseg - INFO - Iter [41650/160000]	lr: 4.438e-05, eta: 9:23:32, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2287, decode.acc_seg: 90.9636, aux.loss_ce: 0.1177, aux.acc_seg: 88.2880, loss: 0.3464, grad_norm: 2.6814
2023-02-19 07:22:25,940 - mmseg - INFO - Iter [41700/160000]	lr: 4.436e-05, eta: 9:23:24, time: 0.327, data_time: 0.048, memory: 15214, decode.loss_ce: 0.2116, decode.acc_seg: 91.6884, aux.loss_ce: 0.1136, aux.acc_seg: 88.8425, loss: 0.3252, grad_norm: 2.4360
2023-02-19 07:22:39,646 - mmseg - INFO - Iter [41750/160000]	lr: 4.434e-05, eta: 9:23:08, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2284, decode.acc_seg: 91.0290, aux.loss_ce: 0.1156, aux.acc_seg: 88.5444, loss: 0.3440, grad_norm: 2.5472
2023-02-19 07:22:53,871 - mmseg - INFO - Iter [41800/160000]	lr: 4.433e-05, eta: 9:22:53, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2102, decode.acc_seg: 91.6968, aux.loss_ce: 0.1115, aux.acc_seg: 89.1002, loss: 0.3217, grad_norm: 2.4190
2023-02-19 07:23:08,175 - mmseg - INFO - Iter [41850/160000]	lr: 4.431e-05, eta: 9:22:39, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2165, decode.acc_seg: 91.4403, aux.loss_ce: 0.1128, aux.acc_seg: 88.8592, loss: 0.3294, grad_norm: 2.7240
2023-02-19 07:23:22,241 - mmseg - INFO - Iter [41900/160000]	lr: 4.429e-05, eta: 9:22:24, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2230, decode.acc_seg: 91.2134, aux.loss_ce: 0.1167, aux.acc_seg: 88.4386, loss: 0.3397, grad_norm: 3.4449
2023-02-19 07:23:36,639 - mmseg - INFO - Iter [41950/160000]	lr: 4.427e-05, eta: 9:22:10, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2200, decode.acc_seg: 91.3777, aux.loss_ce: 0.1149, aux.acc_seg: 88.9100, loss: 0.3349, grad_norm: 2.9888
2023-02-19 07:23:50,592 - mmseg - INFO - Saving checkpoint at 42000 iterations
2023-02-19 07:23:53,825 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:23:53,825 - mmseg - INFO - Iter [42000/160000]	lr: 4.425e-05, eta: 9:22:04, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2164, decode.acc_seg: 91.5008, aux.loss_ce: 0.1133, aux.acc_seg: 88.9782, loss: 0.3297, grad_norm: 3.0548
2023-02-19 07:24:07,456 - mmseg - INFO - Iter [42050/160000]	lr: 4.423e-05, eta: 9:21:48, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2000, decode.acc_seg: 91.8153, aux.loss_ce: 0.1071, aux.acc_seg: 89.1911, loss: 0.3071, grad_norm: 2.5978
2023-02-19 07:24:21,068 - mmseg - INFO - Iter [42100/160000]	lr: 4.421e-05, eta: 9:21:32, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2116, decode.acc_seg: 91.3992, aux.loss_ce: 0.1092, aux.acc_seg: 88.9522, loss: 0.3208, grad_norm: 2.3970
2023-02-19 07:24:35,083 - mmseg - INFO - Iter [42150/160000]	lr: 4.419e-05, eta: 9:21:17, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2167, decode.acc_seg: 91.2818, aux.loss_ce: 0.1117, aux.acc_seg: 88.7609, loss: 0.3284, grad_norm: 2.9390
2023-02-19 07:24:49,107 - mmseg - INFO - Iter [42200/160000]	lr: 4.418e-05, eta: 9:21:02, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2134, decode.acc_seg: 91.4154, aux.loss_ce: 0.1113, aux.acc_seg: 88.6733, loss: 0.3247, grad_norm: 2.9394
2023-02-19 07:25:03,119 - mmseg - INFO - Iter [42250/160000]	lr: 4.416e-05, eta: 9:20:47, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2198, decode.acc_seg: 91.3586, aux.loss_ce: 0.1154, aux.acc_seg: 88.6355, loss: 0.3352, grad_norm: 2.9296
2023-02-19 07:25:16,744 - mmseg - INFO - Iter [42300/160000]	lr: 4.414e-05, eta: 9:20:31, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2137, decode.acc_seg: 91.6405, aux.loss_ce: 0.1146, aux.acc_seg: 88.7868, loss: 0.3283, grad_norm: 2.7218
2023-02-19 07:25:31,183 - mmseg - INFO - Iter [42350/160000]	lr: 4.412e-05, eta: 9:20:17, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2126, decode.acc_seg: 91.6028, aux.loss_ce: 0.1151, aux.acc_seg: 88.8228, loss: 0.3277, grad_norm: 2.4848
2023-02-19 07:25:44,877 - mmseg - INFO - Iter [42400/160000]	lr: 4.410e-05, eta: 9:20:01, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2130, decode.acc_seg: 91.2599, aux.loss_ce: 0.1138, aux.acc_seg: 88.7826, loss: 0.3267, grad_norm: 3.2738
2023-02-19 07:25:59,270 - mmseg - INFO - Iter [42450/160000]	lr: 4.408e-05, eta: 9:19:47, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2161, decode.acc_seg: 91.5970, aux.loss_ce: 0.1126, aux.acc_seg: 88.8946, loss: 0.3287, grad_norm: 3.7273
2023-02-19 07:26:14,141 - mmseg - INFO - Iter [42500/160000]	lr: 4.406e-05, eta: 9:19:34, time: 0.298, data_time: 0.006, memory: 15214, decode.loss_ce: 0.2183, decode.acc_seg: 91.4430, aux.loss_ce: 0.1125, aux.acc_seg: 88.9038, loss: 0.3308, grad_norm: 2.7785
2023-02-19 07:26:27,803 - mmseg - INFO - Iter [42550/160000]	lr: 4.404e-05, eta: 9:19:18, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2058, decode.acc_seg: 91.8342, aux.loss_ce: 0.1089, aux.acc_seg: 89.2251, loss: 0.3147, grad_norm: 2.8809
2023-02-19 07:26:41,580 - mmseg - INFO - Iter [42600/160000]	lr: 4.403e-05, eta: 9:19:02, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2088, decode.acc_seg: 91.5639, aux.loss_ce: 0.1099, aux.acc_seg: 89.0156, loss: 0.3187, grad_norm: 2.8675
2023-02-19 07:26:55,397 - mmseg - INFO - Iter [42650/160000]	lr: 4.401e-05, eta: 9:18:47, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2147, decode.acc_seg: 91.2868, aux.loss_ce: 0.1121, aux.acc_seg: 88.8811, loss: 0.3268, grad_norm: 2.8188
2023-02-19 07:27:09,616 - mmseg - INFO - Iter [42700/160000]	lr: 4.399e-05, eta: 9:18:32, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2112, decode.acc_seg: 91.8290, aux.loss_ce: 0.1113, aux.acc_seg: 89.3606, loss: 0.3224, grad_norm: 2.8167
2023-02-19 07:27:23,735 - mmseg - INFO - Iter [42750/160000]	lr: 4.397e-05, eta: 9:18:18, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2092, decode.acc_seg: 91.7687, aux.loss_ce: 0.1141, aux.acc_seg: 88.9030, loss: 0.3233, grad_norm: 2.9519
2023-02-19 07:27:38,215 - mmseg - INFO - Iter [42800/160000]	lr: 4.395e-05, eta: 9:18:04, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2123, decode.acc_seg: 91.2851, aux.loss_ce: 0.1114, aux.acc_seg: 88.8097, loss: 0.3237, grad_norm: 2.8687
2023-02-19 07:27:52,802 - mmseg - INFO - Iter [42850/160000]	lr: 4.393e-05, eta: 9:17:50, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2084, decode.acc_seg: 91.7476, aux.loss_ce: 0.1096, aux.acc_seg: 89.1046, loss: 0.3180, grad_norm: 2.7934
2023-02-19 07:28:06,824 - mmseg - INFO - Iter [42900/160000]	lr: 4.391e-05, eta: 9:17:35, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2098, decode.acc_seg: 91.5890, aux.loss_ce: 0.1120, aux.acc_seg: 89.0180, loss: 0.3218, grad_norm: 3.1236
2023-02-19 07:28:22,991 - mmseg - INFO - Iter [42950/160000]	lr: 4.389e-05, eta: 9:17:26, time: 0.323, data_time: 0.046, memory: 15214, decode.loss_ce: 0.2112, decode.acc_seg: 91.6021, aux.loss_ce: 0.1112, aux.acc_seg: 89.1119, loss: 0.3224, grad_norm: 3.1626
2023-02-19 07:28:36,533 - mmseg - INFO - Saving checkpoint at 43000 iterations
2023-02-19 07:28:39,769 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:28:39,769 - mmseg - INFO - Iter [43000/160000]	lr: 4.388e-05, eta: 9:17:19, time: 0.336, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2014, decode.acc_seg: 91.6300, aux.loss_ce: 0.1097, aux.acc_seg: 88.9355, loss: 0.3111, grad_norm: 2.8229
2023-02-19 07:28:53,860 - mmseg - INFO - Iter [43050/160000]	lr: 4.386e-05, eta: 9:17:04, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2115, decode.acc_seg: 91.3282, aux.loss_ce: 0.1108, aux.acc_seg: 88.7967, loss: 0.3223, grad_norm: 3.5393
2023-02-19 07:29:07,637 - mmseg - INFO - Iter [43100/160000]	lr: 4.384e-05, eta: 9:16:48, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2184, decode.acc_seg: 91.3126, aux.loss_ce: 0.1143, aux.acc_seg: 88.6173, loss: 0.3328, grad_norm: 3.9155
2023-02-19 07:29:21,364 - mmseg - INFO - Iter [43150/160000]	lr: 4.382e-05, eta: 9:16:32, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2259, decode.acc_seg: 91.2939, aux.loss_ce: 0.1164, aux.acc_seg: 88.6820, loss: 0.3423, grad_norm: 3.2547
2023-02-19 07:29:35,699 - mmseg - INFO - Iter [43200/160000]	lr: 4.380e-05, eta: 9:16:18, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2181, decode.acc_seg: 91.3485, aux.loss_ce: 0.1141, aux.acc_seg: 88.7417, loss: 0.3321, grad_norm: 3.0022
2023-02-19 07:29:49,473 - mmseg - INFO - Iter [43250/160000]	lr: 4.378e-05, eta: 9:16:03, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2268, decode.acc_seg: 91.1827, aux.loss_ce: 0.1170, aux.acc_seg: 88.6024, loss: 0.3439, grad_norm: 3.1252
2023-02-19 07:30:03,142 - mmseg - INFO - Iter [43300/160000]	lr: 4.376e-05, eta: 9:15:47, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2110, decode.acc_seg: 91.5026, aux.loss_ce: 0.1115, aux.acc_seg: 88.8030, loss: 0.3225, grad_norm: 2.7343
2023-02-19 07:30:16,861 - mmseg - INFO - Iter [43350/160000]	lr: 4.374e-05, eta: 9:15:31, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2122, decode.acc_seg: 91.5402, aux.loss_ce: 0.1065, aux.acc_seg: 89.2529, loss: 0.3188, grad_norm: 3.5472
2023-02-19 07:30:30,829 - mmseg - INFO - Iter [43400/160000]	lr: 4.373e-05, eta: 9:15:16, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1959, decode.acc_seg: 92.1374, aux.loss_ce: 0.1045, aux.acc_seg: 89.6493, loss: 0.3004, grad_norm: 2.5594
2023-02-19 07:30:44,714 - mmseg - INFO - Iter [43450/160000]	lr: 4.371e-05, eta: 9:15:00, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2067, decode.acc_seg: 91.6808, aux.loss_ce: 0.1120, aux.acc_seg: 88.8940, loss: 0.3187, grad_norm: 2.7610
2023-02-19 07:30:58,554 - mmseg - INFO - Iter [43500/160000]	lr: 4.369e-05, eta: 9:14:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2070, decode.acc_seg: 91.7367, aux.loss_ce: 0.1059, aux.acc_seg: 89.4470, loss: 0.3130, grad_norm: 3.0877
2023-02-19 07:31:12,580 - mmseg - INFO - Iter [43550/160000]	lr: 4.367e-05, eta: 9:14:30, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2262, decode.acc_seg: 90.9336, aux.loss_ce: 0.1191, aux.acc_seg: 88.0961, loss: 0.3453, grad_norm: 3.2542
2023-02-19 07:31:26,630 - mmseg - INFO - Iter [43600/160000]	lr: 4.365e-05, eta: 9:14:15, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2054, decode.acc_seg: 91.6455, aux.loss_ce: 0.1081, aux.acc_seg: 88.9486, loss: 0.3135, grad_norm: 2.8744
2023-02-19 07:31:40,264 - mmseg - INFO - Iter [43650/160000]	lr: 4.363e-05, eta: 9:13:59, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2070, decode.acc_seg: 91.6561, aux.loss_ce: 0.1079, aux.acc_seg: 89.1734, loss: 0.3149, grad_norm: 3.4310
2023-02-19 07:31:53,931 - mmseg - INFO - Iter [43700/160000]	lr: 4.361e-05, eta: 9:13:43, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2228, decode.acc_seg: 91.2369, aux.loss_ce: 0.1161, aux.acc_seg: 88.6898, loss: 0.3389, grad_norm: 3.6985
2023-02-19 07:32:08,035 - mmseg - INFO - Iter [43750/160000]	lr: 4.359e-05, eta: 9:13:28, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2135, decode.acc_seg: 91.4882, aux.loss_ce: 0.1101, aux.acc_seg: 89.0520, loss: 0.3236, grad_norm: 2.5264
2023-02-19 07:32:21,895 - mmseg - INFO - Iter [43800/160000]	lr: 4.358e-05, eta: 9:13:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2077, decode.acc_seg: 91.7792, aux.loss_ce: 0.1102, aux.acc_seg: 88.9128, loss: 0.3179, grad_norm: 2.5451
2023-02-19 07:32:36,323 - mmseg - INFO - Iter [43850/160000]	lr: 4.356e-05, eta: 9:12:59, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2255, decode.acc_seg: 91.2034, aux.loss_ce: 0.1186, aux.acc_seg: 88.3945, loss: 0.3442, grad_norm: 3.0487
2023-02-19 07:32:50,051 - mmseg - INFO - Iter [43900/160000]	lr: 4.354e-05, eta: 9:12:43, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2120, decode.acc_seg: 91.4462, aux.loss_ce: 0.1130, aux.acc_seg: 88.7520, loss: 0.3249, grad_norm: 2.6269
2023-02-19 07:33:04,467 - mmseg - INFO - Iter [43950/160000]	lr: 4.352e-05, eta: 9:12:29, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1997, decode.acc_seg: 91.9618, aux.loss_ce: 0.1088, aux.acc_seg: 89.0714, loss: 0.3084, grad_norm: 2.7536
2023-02-19 07:33:18,244 - mmseg - INFO - Saving checkpoint at 44000 iterations
2023-02-19 07:33:21,481 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:33:21,481 - mmseg - INFO - Iter [44000/160000]	lr: 4.350e-05, eta: 9:12:22, time: 0.340, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2072, decode.acc_seg: 91.7674, aux.loss_ce: 0.1099, aux.acc_seg: 89.1844, loss: 0.3171, grad_norm: 3.1078
2023-02-19 07:33:35,142 - mmseg - INFO - Iter [44050/160000]	lr: 4.348e-05, eta: 9:12:06, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2236, decode.acc_seg: 91.1797, aux.loss_ce: 0.1133, aux.acc_seg: 88.8425, loss: 0.3369, grad_norm: 2.9867
2023-02-19 07:33:49,056 - mmseg - INFO - Iter [44100/160000]	lr: 4.346e-05, eta: 9:11:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2164, decode.acc_seg: 91.5486, aux.loss_ce: 0.1128, aux.acc_seg: 88.9515, loss: 0.3292, grad_norm: 3.4633
2023-02-19 07:34:03,910 - mmseg - INFO - Iter [44150/160000]	lr: 4.344e-05, eta: 9:11:38, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2093, decode.acc_seg: 91.8274, aux.loss_ce: 0.1130, aux.acc_seg: 89.0070, loss: 0.3223, grad_norm: 2.8847
2023-02-19 07:34:18,760 - mmseg - INFO - Iter [44200/160000]	lr: 4.343e-05, eta: 9:11:25, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2005, decode.acc_seg: 92.1377, aux.loss_ce: 0.1088, aux.acc_seg: 89.2088, loss: 0.3093, grad_norm: 2.4658
2023-02-19 07:34:35,910 - mmseg - INFO - Iter [44250/160000]	lr: 4.341e-05, eta: 9:11:19, time: 0.344, data_time: 0.049, memory: 15214, decode.loss_ce: 0.2253, decode.acc_seg: 91.0874, aux.loss_ce: 0.1187, aux.acc_seg: 88.3510, loss: 0.3440, grad_norm: 3.1643
2023-02-19 07:34:49,970 - mmseg - INFO - Iter [44300/160000]	lr: 4.339e-05, eta: 9:11:04, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2090, decode.acc_seg: 91.6282, aux.loss_ce: 0.1120, aux.acc_seg: 88.8780, loss: 0.3211, grad_norm: 2.8892
2023-02-19 07:35:03,620 - mmseg - INFO - Iter [44350/160000]	lr: 4.337e-05, eta: 9:10:48, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1932, decode.acc_seg: 92.3057, aux.loss_ce: 0.1075, aux.acc_seg: 89.2357, loss: 0.3007, grad_norm: 2.6783
2023-02-19 07:35:17,176 - mmseg - INFO - Iter [44400/160000]	lr: 4.335e-05, eta: 9:10:32, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2020, decode.acc_seg: 91.7568, aux.loss_ce: 0.1074, aux.acc_seg: 89.1642, loss: 0.3094, grad_norm: 3.0137
2023-02-19 07:35:31,148 - mmseg - INFO - Iter [44450/160000]	lr: 4.333e-05, eta: 9:10:16, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2128, decode.acc_seg: 91.5614, aux.loss_ce: 0.1114, aux.acc_seg: 88.8514, loss: 0.3242, grad_norm: 2.6918
2023-02-19 07:35:45,938 - mmseg - INFO - Iter [44500/160000]	lr: 4.331e-05, eta: 9:10:03, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2087, decode.acc_seg: 91.3584, aux.loss_ce: 0.1114, aux.acc_seg: 88.7024, loss: 0.3201, grad_norm: 2.8704
2023-02-19 07:36:00,274 - mmseg - INFO - Iter [44550/160000]	lr: 4.329e-05, eta: 9:09:49, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2057, decode.acc_seg: 91.6859, aux.loss_ce: 0.1070, aux.acc_seg: 89.1184, loss: 0.3127, grad_norm: 2.4158
2023-02-19 07:36:13,989 - mmseg - INFO - Iter [44600/160000]	lr: 4.328e-05, eta: 9:09:33, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2106, decode.acc_seg: 91.3993, aux.loss_ce: 0.1118, aux.acc_seg: 88.4888, loss: 0.3224, grad_norm: 3.0936
2023-02-19 07:36:27,655 - mmseg - INFO - Iter [44650/160000]	lr: 4.326e-05, eta: 9:09:18, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1960, decode.acc_seg: 92.2361, aux.loss_ce: 0.1052, aux.acc_seg: 89.6311, loss: 0.3011, grad_norm: 2.6164
2023-02-19 07:36:41,673 - mmseg - INFO - Iter [44700/160000]	lr: 4.324e-05, eta: 9:09:03, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2144, decode.acc_seg: 91.1865, aux.loss_ce: 0.1091, aux.acc_seg: 88.8747, loss: 0.3234, grad_norm: 3.4353
2023-02-19 07:36:55,974 - mmseg - INFO - Iter [44750/160000]	lr: 4.322e-05, eta: 9:08:48, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1972, decode.acc_seg: 91.9388, aux.loss_ce: 0.1045, aux.acc_seg: 89.3421, loss: 0.3017, grad_norm: 3.5254
2023-02-19 07:37:10,145 - mmseg - INFO - Iter [44800/160000]	lr: 4.320e-05, eta: 9:08:34, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2144, decode.acc_seg: 91.3405, aux.loss_ce: 0.1115, aux.acc_seg: 88.9025, loss: 0.3258, grad_norm: 3.6038
2023-02-19 07:37:24,074 - mmseg - INFO - Iter [44850/160000]	lr: 4.318e-05, eta: 9:08:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2136, decode.acc_seg: 91.4459, aux.loss_ce: 0.1150, aux.acc_seg: 88.5210, loss: 0.3287, grad_norm: 2.8122
2023-02-19 07:37:37,757 - mmseg - INFO - Iter [44900/160000]	lr: 4.316e-05, eta: 9:08:03, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1978, decode.acc_seg: 92.0431, aux.loss_ce: 0.1057, aux.acc_seg: 89.4542, loss: 0.3035, grad_norm: 2.8070
2023-02-19 07:37:51,335 - mmseg - INFO - Iter [44950/160000]	lr: 4.314e-05, eta: 9:07:47, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1999, decode.acc_seg: 92.0565, aux.loss_ce: 0.1047, aux.acc_seg: 89.6167, loss: 0.3046, grad_norm: 2.5426
2023-02-19 07:38:05,243 - mmseg - INFO - Saving checkpoint at 45000 iterations
2023-02-19 07:38:08,496 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:38:08,496 - mmseg - INFO - Iter [45000/160000]	lr: 4.313e-05, eta: 9:07:40, time: 0.343, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2080, decode.acc_seg: 91.8197, aux.loss_ce: 0.1094, aux.acc_seg: 89.2433, loss: 0.3174, grad_norm: 2.7908
2023-02-19 07:38:22,298 - mmseg - INFO - Iter [45050/160000]	lr: 4.311e-05, eta: 9:07:24, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2066, decode.acc_seg: 91.8449, aux.loss_ce: 0.1094, aux.acc_seg: 89.2443, loss: 0.3160, grad_norm: 2.7495
2023-02-19 07:38:36,700 - mmseg - INFO - Iter [45100/160000]	lr: 4.309e-05, eta: 9:07:10, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2146, decode.acc_seg: 91.3971, aux.loss_ce: 0.1149, aux.acc_seg: 88.4364, loss: 0.3295, grad_norm: 3.5185
2023-02-19 07:38:50,454 - mmseg - INFO - Iter [45150/160000]	lr: 4.307e-05, eta: 9:06:55, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2062, decode.acc_seg: 91.6794, aux.loss_ce: 0.1060, aux.acc_seg: 89.4185, loss: 0.3122, grad_norm: 3.6803
2023-02-19 07:39:04,086 - mmseg - INFO - Iter [45200/160000]	lr: 4.305e-05, eta: 9:06:39, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2100, decode.acc_seg: 91.5308, aux.loss_ce: 0.1115, aux.acc_seg: 88.6802, loss: 0.3215, grad_norm: 3.6934
2023-02-19 07:39:17,747 - mmseg - INFO - Iter [45250/160000]	lr: 4.303e-05, eta: 9:06:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2066, decode.acc_seg: 91.5557, aux.loss_ce: 0.1078, aux.acc_seg: 89.2153, loss: 0.3144, grad_norm: 2.6821
2023-02-19 07:39:31,560 - mmseg - INFO - Iter [45300/160000]	lr: 4.301e-05, eta: 9:06:07, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1974, decode.acc_seg: 91.8905, aux.loss_ce: 0.1058, aux.acc_seg: 89.1783, loss: 0.3032, grad_norm: 3.1105
2023-02-19 07:39:47,448 - mmseg - INFO - Iter [45350/160000]	lr: 4.299e-05, eta: 9:05:57, time: 0.318, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2000, decode.acc_seg: 92.0324, aux.loss_ce: 0.1057, aux.acc_seg: 89.5688, loss: 0.3057, grad_norm: 2.6921
2023-02-19 07:40:01,197 - mmseg - INFO - Iter [45400/160000]	lr: 4.298e-05, eta: 9:05:41, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2073, decode.acc_seg: 91.5143, aux.loss_ce: 0.1098, aux.acc_seg: 88.7623, loss: 0.3170, grad_norm: 2.6490
2023-02-19 07:40:15,794 - mmseg - INFO - Iter [45450/160000]	lr: 4.296e-05, eta: 9:05:28, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2134, decode.acc_seg: 91.6109, aux.loss_ce: 0.1120, aux.acc_seg: 88.8706, loss: 0.3254, grad_norm: 3.3450
2023-02-19 07:40:32,236 - mmseg - INFO - Iter [45500/160000]	lr: 4.294e-05, eta: 9:05:19, time: 0.329, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1985, decode.acc_seg: 91.9210, aux.loss_ce: 0.1080, aux.acc_seg: 89.0787, loss: 0.3065, grad_norm: 2.5533
2023-02-19 07:40:45,989 - mmseg - INFO - Iter [45550/160000]	lr: 4.292e-05, eta: 9:05:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2047, decode.acc_seg: 92.0541, aux.loss_ce: 0.1100, aux.acc_seg: 89.2285, loss: 0.3147, grad_norm: 2.5166
2023-02-19 07:41:00,375 - mmseg - INFO - Iter [45600/160000]	lr: 4.290e-05, eta: 9:04:49, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1959, decode.acc_seg: 92.0103, aux.loss_ce: 0.1030, aux.acc_seg: 89.5928, loss: 0.2989, grad_norm: 2.7827
2023-02-19 07:41:14,453 - mmseg - INFO - Iter [45650/160000]	lr: 4.288e-05, eta: 9:04:35, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1943, decode.acc_seg: 92.1971, aux.loss_ce: 0.1020, aux.acc_seg: 89.7820, loss: 0.2964, grad_norm: 2.3329
2023-02-19 07:41:28,123 - mmseg - INFO - Iter [45700/160000]	lr: 4.286e-05, eta: 9:04:19, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1993, decode.acc_seg: 91.7890, aux.loss_ce: 0.1076, aux.acc_seg: 89.2266, loss: 0.3069, grad_norm: 3.0870
2023-02-19 07:41:42,088 - mmseg - INFO - Iter [45750/160000]	lr: 4.284e-05, eta: 9:04:04, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2041, decode.acc_seg: 91.5991, aux.loss_ce: 0.1074, aux.acc_seg: 89.0022, loss: 0.3115, grad_norm: 3.8080
2023-02-19 07:41:55,821 - mmseg - INFO - Iter [45800/160000]	lr: 4.283e-05, eta: 9:03:48, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1947, decode.acc_seg: 92.0018, aux.loss_ce: 0.1030, aux.acc_seg: 89.5308, loss: 0.2976, grad_norm: 2.9983
2023-02-19 07:42:09,769 - mmseg - INFO - Iter [45850/160000]	lr: 4.281e-05, eta: 9:03:33, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1994, decode.acc_seg: 91.8358, aux.loss_ce: 0.1033, aux.acc_seg: 89.4751, loss: 0.3027, grad_norm: 2.6097
2023-02-19 07:42:23,639 - mmseg - INFO - Iter [45900/160000]	lr: 4.279e-05, eta: 9:03:18, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2057, decode.acc_seg: 91.7121, aux.loss_ce: 0.1090, aux.acc_seg: 89.0921, loss: 0.3146, grad_norm: 2.6762
2023-02-19 07:42:37,456 - mmseg - INFO - Iter [45950/160000]	lr: 4.277e-05, eta: 9:03:02, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1974, decode.acc_seg: 92.1433, aux.loss_ce: 0.1089, aux.acc_seg: 89.2137, loss: 0.3063, grad_norm: 2.5836
2023-02-19 07:42:51,116 - mmseg - INFO - Saving checkpoint at 46000 iterations
2023-02-19 07:42:54,329 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:42:54,330 - mmseg - INFO - Iter [46000/160000]	lr: 4.275e-05, eta: 9:02:54, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1981, decode.acc_seg: 92.2219, aux.loss_ce: 0.1076, aux.acc_seg: 89.4260, loss: 0.3058, grad_norm: 2.5981
2023-02-19 07:43:08,050 - mmseg - INFO - Iter [46050/160000]	lr: 4.273e-05, eta: 9:02:39, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2012, decode.acc_seg: 91.9120, aux.loss_ce: 0.1087, aux.acc_seg: 89.0722, loss: 0.3099, grad_norm: 2.7830
2023-02-19 07:43:21,907 - mmseg - INFO - Iter [46100/160000]	lr: 4.271e-05, eta: 9:02:23, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1885, decode.acc_seg: 92.0965, aux.loss_ce: 0.1046, aux.acc_seg: 89.2726, loss: 0.2931, grad_norm: 2.9056
2023-02-19 07:43:35,679 - mmseg - INFO - Iter [46150/160000]	lr: 4.269e-05, eta: 9:02:08, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2146, decode.acc_seg: 91.5024, aux.loss_ce: 0.1173, aux.acc_seg: 88.3613, loss: 0.3320, grad_norm: 3.4121
2023-02-19 07:43:49,250 - mmseg - INFO - Iter [46200/160000]	lr: 4.268e-05, eta: 9:01:52, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2015, decode.acc_seg: 91.7572, aux.loss_ce: 0.1069, aux.acc_seg: 89.3184, loss: 0.3084, grad_norm: 2.4712
2023-02-19 07:44:03,924 - mmseg - INFO - Iter [46250/160000]	lr: 4.266e-05, eta: 9:01:38, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2051, decode.acc_seg: 91.7138, aux.loss_ce: 0.1082, aux.acc_seg: 89.1076, loss: 0.3133, grad_norm: 3.0787
2023-02-19 07:44:17,662 - mmseg - INFO - Iter [46300/160000]	lr: 4.264e-05, eta: 9:01:23, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1954, decode.acc_seg: 92.1248, aux.loss_ce: 0.1026, aux.acc_seg: 89.7889, loss: 0.2980, grad_norm: 2.9621
2023-02-19 07:44:31,305 - mmseg - INFO - Iter [46350/160000]	lr: 4.262e-05, eta: 9:01:07, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2004, decode.acc_seg: 91.9413, aux.loss_ce: 0.1071, aux.acc_seg: 89.2665, loss: 0.3076, grad_norm: 3.6080
2023-02-19 07:44:45,039 - mmseg - INFO - Iter [46400/160000]	lr: 4.260e-05, eta: 9:00:51, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2067, decode.acc_seg: 91.5446, aux.loss_ce: 0.1061, aux.acc_seg: 89.1617, loss: 0.3128, grad_norm: 2.8335
2023-02-19 07:44:59,413 - mmseg - INFO - Iter [46450/160000]	lr: 4.258e-05, eta: 9:00:37, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1964, decode.acc_seg: 91.8936, aux.loss_ce: 0.1034, aux.acc_seg: 89.3786, loss: 0.2998, grad_norm: 2.9132
2023-02-19 07:45:13,246 - mmseg - INFO - Iter [46500/160000]	lr: 4.256e-05, eta: 9:00:22, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1949, decode.acc_seg: 92.0609, aux.loss_ce: 0.1034, aux.acc_seg: 89.5763, loss: 0.2983, grad_norm: 2.0775
2023-02-19 07:45:26,849 - mmseg - INFO - Iter [46550/160000]	lr: 4.254e-05, eta: 9:00:06, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2110, decode.acc_seg: 91.5386, aux.loss_ce: 0.1142, aux.acc_seg: 88.6348, loss: 0.3252, grad_norm: 2.6331
2023-02-19 07:45:40,642 - mmseg - INFO - Iter [46600/160000]	lr: 4.253e-05, eta: 8:59:50, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1973, decode.acc_seg: 92.2152, aux.loss_ce: 0.1039, aux.acc_seg: 89.7353, loss: 0.3012, grad_norm: 2.5695
2023-02-19 07:45:54,272 - mmseg - INFO - Iter [46650/160000]	lr: 4.251e-05, eta: 8:59:34, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1903, decode.acc_seg: 92.2827, aux.loss_ce: 0.1004, aux.acc_seg: 89.8069, loss: 0.2907, grad_norm: 2.7161
2023-02-19 07:46:08,823 - mmseg - INFO - Iter [46700/160000]	lr: 4.249e-05, eta: 8:59:21, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2054, decode.acc_seg: 91.7625, aux.loss_ce: 0.1098, aux.acc_seg: 88.9395, loss: 0.3152, grad_norm: 2.6683
2023-02-19 07:46:24,560 - mmseg - INFO - Iter [46750/160000]	lr: 4.247e-05, eta: 8:59:10, time: 0.315, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1962, decode.acc_seg: 92.3224, aux.loss_ce: 0.1063, aux.acc_seg: 89.3795, loss: 0.3024, grad_norm: 2.6877
2023-02-19 07:46:38,725 - mmseg - INFO - Iter [46800/160000]	lr: 4.245e-05, eta: 8:58:55, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1900, decode.acc_seg: 92.2051, aux.loss_ce: 0.1016, aux.acc_seg: 89.5361, loss: 0.2916, grad_norm: 2.5658
2023-02-19 07:46:53,315 - mmseg - INFO - Iter [46850/160000]	lr: 4.243e-05, eta: 8:58:42, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1945, decode.acc_seg: 92.1196, aux.loss_ce: 0.1042, aux.acc_seg: 89.4741, loss: 0.2986, grad_norm: 3.3458
2023-02-19 07:47:07,357 - mmseg - INFO - Iter [46900/160000]	lr: 4.241e-05, eta: 8:58:27, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1917, decode.acc_seg: 92.2661, aux.loss_ce: 0.1021, aux.acc_seg: 89.8279, loss: 0.2938, grad_norm: 2.7352
2023-02-19 07:47:21,162 - mmseg - INFO - Iter [46950/160000]	lr: 4.239e-05, eta: 8:58:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1909, decode.acc_seg: 92.3449, aux.loss_ce: 0.1041, aux.acc_seg: 89.4977, loss: 0.2950, grad_norm: 3.3126
2023-02-19 07:47:35,396 - mmseg - INFO - Saving checkpoint at 47000 iterations
2023-02-19 07:47:38,618 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:47:38,618 - mmseg - INFO - Iter [47000/160000]	lr: 4.238e-05, eta: 8:58:05, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2012, decode.acc_seg: 91.8412, aux.loss_ce: 0.1079, aux.acc_seg: 88.9549, loss: 0.3091, grad_norm: 3.3987
2023-02-19 07:47:52,707 - mmseg - INFO - Iter [47050/160000]	lr: 4.236e-05, eta: 8:57:50, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1918, decode.acc_seg: 91.9576, aux.loss_ce: 0.1037, aux.acc_seg: 89.1879, loss: 0.2955, grad_norm: 3.3391
2023-02-19 07:48:06,885 - mmseg - INFO - Iter [47100/160000]	lr: 4.234e-05, eta: 8:57:36, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1968, decode.acc_seg: 92.0971, aux.loss_ce: 0.1079, aux.acc_seg: 89.2444, loss: 0.3048, grad_norm: 2.9784
2023-02-19 07:48:20,605 - mmseg - INFO - Iter [47150/160000]	lr: 4.232e-05, eta: 8:57:20, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1919, decode.acc_seg: 92.0555, aux.loss_ce: 0.1053, aux.acc_seg: 89.2843, loss: 0.2972, grad_norm: 2.6394
2023-02-19 07:48:34,305 - mmseg - INFO - Iter [47200/160000]	lr: 4.230e-05, eta: 8:57:04, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1972, decode.acc_seg: 91.9171, aux.loss_ce: 0.1041, aux.acc_seg: 89.4584, loss: 0.3013, grad_norm: 3.1000
2023-02-19 07:48:48,305 - mmseg - INFO - Iter [47250/160000]	lr: 4.228e-05, eta: 8:56:49, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1979, decode.acc_seg: 91.8775, aux.loss_ce: 0.1061, aux.acc_seg: 89.2325, loss: 0.3039, grad_norm: 2.7147
2023-02-19 07:49:02,216 - mmseg - INFO - Iter [47300/160000]	lr: 4.226e-05, eta: 8:56:34, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2001, decode.acc_seg: 91.9934, aux.loss_ce: 0.1043, aux.acc_seg: 89.5925, loss: 0.3044, grad_norm: 2.1957
2023-02-19 07:49:16,159 - mmseg - INFO - Iter [47350/160000]	lr: 4.224e-05, eta: 8:56:19, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1841, decode.acc_seg: 92.6989, aux.loss_ce: 0.1008, aux.acc_seg: 90.0404, loss: 0.2849, grad_norm: 2.3957
2023-02-19 07:49:30,220 - mmseg - INFO - Iter [47400/160000]	lr: 4.223e-05, eta: 8:56:04, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2010, decode.acc_seg: 91.7160, aux.loss_ce: 0.1060, aux.acc_seg: 89.2479, loss: 0.3071, grad_norm: 3.3394
2023-02-19 07:49:43,961 - mmseg - INFO - Iter [47450/160000]	lr: 4.221e-05, eta: 8:55:49, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2033, decode.acc_seg: 91.7050, aux.loss_ce: 0.1089, aux.acc_seg: 88.9536, loss: 0.3122, grad_norm: 2.9534
2023-02-19 07:49:57,672 - mmseg - INFO - Iter [47500/160000]	lr: 4.219e-05, eta: 8:55:33, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2098, decode.acc_seg: 91.7786, aux.loss_ce: 0.1137, aux.acc_seg: 89.0010, loss: 0.3235, grad_norm: 3.9092
2023-02-19 07:50:11,807 - mmseg - INFO - Iter [47550/160000]	lr: 4.217e-05, eta: 8:55:18, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2028, decode.acc_seg: 91.8805, aux.loss_ce: 0.1103, aux.acc_seg: 88.9510, loss: 0.3131, grad_norm: 2.5180
2023-02-19 07:50:26,919 - mmseg - INFO - Iter [47600/160000]	lr: 4.215e-05, eta: 8:55:06, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2021, decode.acc_seg: 91.7540, aux.loss_ce: 0.1072, aux.acc_seg: 89.2005, loss: 0.3093, grad_norm: 2.8082
2023-02-19 07:50:40,528 - mmseg - INFO - Iter [47650/160000]	lr: 4.213e-05, eta: 8:54:50, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2015, decode.acc_seg: 91.8344, aux.loss_ce: 0.1072, aux.acc_seg: 88.9900, loss: 0.3087, grad_norm: 2.6651
2023-02-19 07:50:54,783 - mmseg - INFO - Iter [47700/160000]	lr: 4.211e-05, eta: 8:54:36, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2162, decode.acc_seg: 91.4458, aux.loss_ce: 0.1134, aux.acc_seg: 88.7519, loss: 0.3296, grad_norm: 3.3824
2023-02-19 07:51:09,289 - mmseg - INFO - Iter [47750/160000]	lr: 4.209e-05, eta: 8:54:22, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1996, decode.acc_seg: 91.8440, aux.loss_ce: 0.1063, aux.acc_seg: 89.3660, loss: 0.3060, grad_norm: 2.7118
2023-02-19 07:51:23,021 - mmseg - INFO - Iter [47800/160000]	lr: 4.208e-05, eta: 8:54:07, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2014, decode.acc_seg: 92.1537, aux.loss_ce: 0.1089, aux.acc_seg: 89.3113, loss: 0.3104, grad_norm: 2.9668
2023-02-19 07:51:36,848 - mmseg - INFO - Iter [47850/160000]	lr: 4.206e-05, eta: 8:53:51, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2000, decode.acc_seg: 92.0274, aux.loss_ce: 0.1087, aux.acc_seg: 89.0999, loss: 0.3087, grad_norm: 2.9078
2023-02-19 07:51:50,932 - mmseg - INFO - Iter [47900/160000]	lr: 4.204e-05, eta: 8:53:37, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1946, decode.acc_seg: 91.9324, aux.loss_ce: 0.1059, aux.acc_seg: 89.2121, loss: 0.3006, grad_norm: 2.9064
2023-02-19 07:52:05,221 - mmseg - INFO - Iter [47950/160000]	lr: 4.202e-05, eta: 8:53:22, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1855, decode.acc_seg: 92.4943, aux.loss_ce: 0.1022, aux.acc_seg: 89.7901, loss: 0.2878, grad_norm: 2.5768
2023-02-19 07:52:22,449 - mmseg - INFO - Saving checkpoint at 48000 iterations
2023-02-19 07:52:25,700 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:52:25,700 - mmseg - INFO - Iter [48000/160000]	lr: 4.200e-05, eta: 8:53:22, time: 0.410, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1980, decode.acc_seg: 91.9145, aux.loss_ce: 0.1033, aux.acc_seg: 89.4732, loss: 0.3013, grad_norm: 2.3537
2023-02-19 07:52:40,931 - mmseg - INFO - per class results:
2023-02-19 07:52:40,937 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 74.66 | 78.81 |
|       building      | 82.21 | 90.07 |
|         sky         | 92.29 |  98.9 |
|        floor        | 80.25 | 92.16 |
|         tree        | 73.28 | 83.23 |
|       ceiling       | 81.97 | 96.39 |
|         road        | 84.44 |  91.3 |
|         bed         | 89.92 | 96.63 |
|      windowpane     | 60.83 | 79.91 |
|        grass        | 64.05 |  79.3 |
|       cabinet       | 59.61 | 72.73 |
|       sidewalk      | 70.68 | 82.68 |
|        person       | 80.29 | 94.25 |
|        earth        | 32.27 | 44.11 |
|         door        | 50.73 | 74.09 |
|        table        |  62.9 | 75.08 |
|       mountain      | 60.29 | 72.15 |
|        plant        | 49.81 | 60.05 |
|       curtain       | 74.11 | 91.09 |
|        chair        |  59.1 | 67.32 |
|         car         | 84.98 |  91.9 |
|        water        | 54.72 | 70.97 |
|       painting      | 69.35 | 92.91 |
|         sofa        | 69.49 | 85.97 |
|        shelf        | 40.97 | 74.54 |
|        house        |  43.8 | 64.68 |
|         sea         | 62.67 | 84.45 |
|        mirror       | 68.57 | 86.04 |
|         rug         | 63.86 | 77.05 |
|        field        | 29.56 | 58.43 |
|       armchair      | 49.07 | 69.57 |
|         seat        | 60.26 | 84.94 |
|        fence        | 47.42 | 68.29 |
|         desk        |  50.4 | 60.86 |
|         rock        | 51.52 | 79.53 |
|       wardrobe      | 47.13 | 74.01 |
|         lamp        |  62.1 | 82.57 |
|       bathtub       | 77.04 | 86.77 |
|       railing       |  39.3 | 60.29 |
|       cushion       | 62.11 | 79.51 |
|         base        |  38.2 |  54.9 |
|         box         | 28.51 | 34.27 |
|        column       | 46.67 | 61.27 |
|      signboard      | 35.12 | 53.18 |
|   chest of drawers  | 45.46 | 60.55 |
|       counter       | 29.74 | 41.11 |
|         sand        | 42.85 | 61.98 |
|         sink        |  71.4 | 77.85 |
|      skyscraper     |  51.2 |  66.7 |
|      fireplace      | 54.94 | 96.83 |
|     refrigerator    | 75.37 | 81.97 |
|      grandstand     | 49.67 | 68.85 |
|         path        | 22.91 | 32.65 |
|        stairs       | 35.07 | 44.38 |
|        runway       | 66.78 | 91.36 |
|         case        | 51.89 | 62.21 |
|      pool table     | 92.98 |  95.3 |
|        pillow       | 56.57 | 64.16 |
|     screen door     | 76.38 | 80.96 |
|       stairway      | 42.35 | 48.77 |
|        river        | 14.94 | 33.97 |
|        bridge       | 66.84 | 83.85 |
|       bookcase      | 49.07 | 71.59 |
|        blind        | 50.79 | 67.03 |
|     coffee table    | 60.23 |  76.1 |
|        toilet       | 83.19 | 93.26 |
|        flower       | 41.94 | 68.01 |
|         book        | 31.05 | 39.61 |
|         hill        | 12.57 | 24.16 |
|        bench        | 48.09 | 64.67 |
|      countertop     | 51.95 | 84.02 |
|        stove        | 79.59 | 88.25 |
|         palm        | 50.78 | 84.75 |
|    kitchen island   |  42.6 | 73.79 |
|       computer      | 75.58 | 92.26 |
|     swivel chair    | 44.97 |  68.3 |
|         boat        | 52.39 | 60.84 |
|         bar         | 35.63 | 55.42 |
|    arcade machine   | 54.63 | 59.07 |
|        hovel        | 41.49 | 64.38 |
|         bus         | 83.99 | 97.43 |
|        towel        | 64.65 | 86.83 |
|        light        | 46.99 |  49.6 |
|        truck        | 37.82 | 55.15 |
|        tower        | 30.74 | 57.74 |
|      chandelier     | 66.49 | 83.38 |
|        awning       | 30.93 | 38.68 |
|     streetlight     |  28.1 | 38.13 |
|        booth        |  37.7 | 54.15 |
| television receiver | 67.52 |  87.6 |
|       airplane      | 57.75 |  62.9 |
|      dirt track     |  4.93 | 22.43 |
|       apparel       | 40.75 | 83.33 |
|         pole        | 25.18 | 50.86 |
|         land        |  4.3  | 13.19 |
|      bannister      |  8.66 | 11.13 |
|      escalator      | 37.44 | 53.41 |
|       ottoman       | 49.65 | 68.07 |
|        bottle       | 37.18 | 67.73 |
|        buffet       | 49.99 |  70.6 |
|        poster       | 27.71 | 44.73 |
|        stage        | 16.28 |  23.6 |
|         van         | 38.77 | 50.67 |
|         ship        | 57.74 | 83.12 |
|       fountain      | 21.26 | 22.04 |
|    conveyer belt    | 80.86 | 91.38 |
|        canopy       | 42.26 | 54.11 |
|        washer       | 76.43 | 81.34 |
|      plaything      | 25.94 | 34.21 |
|    swimming pool    | 53.26 | 72.04 |
|        stool        | 36.47 | 58.96 |
|        barrel       | 23.36 | 73.41 |
|        basket       | 34.63 | 55.85 |
|      waterfall      | 46.79 | 50.79 |
|         tent        |  93.6 | 98.45 |
|         bag         | 16.23 | 18.34 |
|       minibike      | 65.08 | 76.27 |
|        cradle       | 86.35 | 94.88 |
|         oven        | 49.25 | 55.58 |
|         ball        | 57.15 | 76.04 |
|         food        | 59.01 | 72.07 |
|         step        |  7.36 |  8.74 |
|         tank        | 50.03 | 55.61 |
|      trade name     | 18.12 |  21.4 |
|      microwave      |  80.9 | 92.99 |
|         pot         | 41.99 | 48.52 |
|        animal       | 65.73 | 69.74 |
|       bicycle       | 50.36 | 77.14 |
|         lake        | 38.95 | 39.05 |
|      dishwasher     | 67.76 |  87.8 |
|        screen       | 56.33 | 88.23 |
|       blanket       | 26.07 | 35.97 |
|      sculpture      | 61.74 | 85.12 |
|         hood        |  63.0 | 77.39 |
|        sconce       | 39.34 | 62.03 |
|         vase        | 34.92 |  63.5 |
|    traffic light    | 36.51 | 47.86 |
|         tray        |  8.48 | 15.28 |
|        ashcan       | 47.04 | 61.37 |
|         fan         | 63.06 | 76.05 |
|         pier        |  26.1 | 73.73 |
|      crt screen     |  3.41 |  9.9  |
|        plate        |  57.6 | 74.29 |
|       monitor       |  6.99 |  8.99 |
|    bulletin board   | 41.95 | 45.51 |
|        shower       |  0.25 |  0.26 |
|       radiator      | 67.81 | 88.28 |
|        glass        | 14.58 | 16.84 |
|        clock        | 39.46 |  54.0 |
|         flag        | 68.41 | 78.92 |
+---------------------+-------+-------+
2023-02-19 07:52:40,938 - mmseg - INFO - Summary:
2023-02-19 07:52:40,938 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 82.41 | 50.16 | 65.27 |
+-------+-------+-------+
2023-02-19 07:52:44,126 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_48000.pth.
2023-02-19 07:52:44,126 - mmseg - INFO - Best mIoU is 0.5016 at 48000 iter.
2023-02-19 07:52:44,127 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:52:44,127 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8241, mIoU: 0.5016, mAcc: 0.6527, IoU.wall: 0.7466, IoU.building: 0.8221, IoU.sky: 0.9229, IoU.floor: 0.8025, IoU.tree: 0.7328, IoU.ceiling: 0.8197, IoU.road: 0.8444, IoU.bed : 0.8992, IoU.windowpane: 0.6083, IoU.grass: 0.6405, IoU.cabinet: 0.5961, IoU.sidewalk: 0.7068, IoU.person: 0.8029, IoU.earth: 0.3227, IoU.door: 0.5073, IoU.table: 0.6290, IoU.mountain: 0.6029, IoU.plant: 0.4981, IoU.curtain: 0.7411, IoU.chair: 0.5910, IoU.car: 0.8498, IoU.water: 0.5472, IoU.painting: 0.6935, IoU.sofa: 0.6949, IoU.shelf: 0.4097, IoU.house: 0.4380, IoU.sea: 0.6267, IoU.mirror: 0.6857, IoU.rug: 0.6386, IoU.field: 0.2956, IoU.armchair: 0.4907, IoU.seat: 0.6026, IoU.fence: 0.4742, IoU.desk: 0.5040, IoU.rock: 0.5152, IoU.wardrobe: 0.4713, IoU.lamp: 0.6210, IoU.bathtub: 0.7704, IoU.railing: 0.3930, IoU.cushion: 0.6211, IoU.base: 0.3820, IoU.box: 0.2851, IoU.column: 0.4667, IoU.signboard: 0.3512, IoU.chest of drawers: 0.4546, IoU.counter: 0.2974, IoU.sand: 0.4285, IoU.sink: 0.7140, IoU.skyscraper: 0.5120, IoU.fireplace: 0.5494, IoU.refrigerator: 0.7537, IoU.grandstand: 0.4967, IoU.path: 0.2291, IoU.stairs: 0.3507, IoU.runway: 0.6678, IoU.case: 0.5189, IoU.pool table: 0.9298, IoU.pillow: 0.5657, IoU.screen door: 0.7638, IoU.stairway: 0.4235, IoU.river: 0.1494, IoU.bridge: 0.6684, IoU.bookcase: 0.4907, IoU.blind: 0.5079, IoU.coffee table: 0.6023, IoU.toilet: 0.8319, IoU.flower: 0.4194, IoU.book: 0.3105, IoU.hill: 0.1257, IoU.bench: 0.4809, IoU.countertop: 0.5195, IoU.stove: 0.7959, IoU.palm: 0.5078, IoU.kitchen island: 0.4260, IoU.computer: 0.7558, IoU.swivel chair: 0.4497, IoU.boat: 0.5239, IoU.bar: 0.3563, IoU.arcade machine: 0.5463, IoU.hovel: 0.4149, IoU.bus: 0.8399, IoU.towel: 0.6465, IoU.light: 0.4699, IoU.truck: 0.3782, IoU.tower: 0.3074, IoU.chandelier: 0.6649, IoU.awning: 0.3093, IoU.streetlight: 0.2810, IoU.booth: 0.3770, IoU.television receiver: 0.6752, IoU.airplane: 0.5775, IoU.dirt track: 0.0493, IoU.apparel: 0.4075, IoU.pole: 0.2518, IoU.land: 0.0430, IoU.bannister: 0.0866, IoU.escalator: 0.3744, IoU.ottoman: 0.4965, IoU.bottle: 0.3718, IoU.buffet: 0.4999, IoU.poster: 0.2771, IoU.stage: 0.1628, IoU.van: 0.3877, IoU.ship: 0.5774, IoU.fountain: 0.2126, IoU.conveyer belt: 0.8086, IoU.canopy: 0.4226, IoU.washer: 0.7643, IoU.plaything: 0.2594, IoU.swimming pool: 0.5326, IoU.stool: 0.3647, IoU.barrel: 0.2336, IoU.basket: 0.3463, IoU.waterfall: 0.4679, IoU.tent: 0.9360, IoU.bag: 0.1623, IoU.minibike: 0.6508, IoU.cradle: 0.8635, IoU.oven: 0.4925, IoU.ball: 0.5715, IoU.food: 0.5901, IoU.step: 0.0736, IoU.tank: 0.5003, IoU.trade name: 0.1812, IoU.microwave: 0.8090, IoU.pot: 0.4199, IoU.animal: 0.6573, IoU.bicycle: 0.5036, IoU.lake: 0.3895, IoU.dishwasher: 0.6776, IoU.screen: 0.5633, IoU.blanket: 0.2607, IoU.sculpture: 0.6174, IoU.hood: 0.6300, IoU.sconce: 0.3934, IoU.vase: 0.3492, IoU.traffic light: 0.3651, IoU.tray: 0.0848, IoU.ashcan: 0.4704, IoU.fan: 0.6306, IoU.pier: 0.2610, IoU.crt screen: 0.0341, IoU.plate: 0.5760, IoU.monitor: 0.0699, IoU.bulletin board: 0.4195, IoU.shower: 0.0025, IoU.radiator: 0.6781, IoU.glass: 0.1458, IoU.clock: 0.3946, IoU.flag: 0.6841, Acc.wall: 0.7881, Acc.building: 0.9007, Acc.sky: 0.9890, Acc.floor: 0.9216, Acc.tree: 0.8323, Acc.ceiling: 0.9639, Acc.road: 0.9130, Acc.bed : 0.9663, Acc.windowpane: 0.7991, Acc.grass: 0.7930, Acc.cabinet: 0.7273, Acc.sidewalk: 0.8268, Acc.person: 0.9425, Acc.earth: 0.4411, Acc.door: 0.7409, Acc.table: 0.7508, Acc.mountain: 0.7215, Acc.plant: 0.6005, Acc.curtain: 0.9109, 
Acc.chair: 0.6732, Acc.car: 0.9190, Acc.water: 0.7097, Acc.painting: 0.9291, Acc.sofa: 0.8597, Acc.shelf: 0.7454, Acc.house: 0.6468, Acc.sea: 0.8445, Acc.mirror: 0.8604, Acc.rug: 0.7705, Acc.field: 0.5843, Acc.armchair: 0.6957, Acc.seat: 0.8494, Acc.fence: 0.6829, Acc.desk: 0.6086, Acc.rock: 0.7953, Acc.wardrobe: 0.7401, Acc.lamp: 0.8257, Acc.bathtub: 0.8677, Acc.railing: 0.6029, Acc.cushion: 0.7951, Acc.base: 0.5490, Acc.box: 0.3427, Acc.column: 0.6127, Acc.signboard: 0.5318, Acc.chest of drawers: 0.6055, Acc.counter: 0.4111, Acc.sand: 0.6198, Acc.sink: 0.7785, Acc.skyscraper: 0.6670, Acc.fireplace: 0.9683, Acc.refrigerator: 0.8197, Acc.grandstand: 0.6885, Acc.path: 0.3265, Acc.stairs: 0.4438, Acc.runway: 0.9136, Acc.case: 0.6221, Acc.pool table: 0.9530, Acc.pillow: 0.6416, Acc.screen door: 0.8096, Acc.stairway: 0.4877, Acc.river: 0.3397, Acc.bridge: 0.8385, Acc.bookcase: 0.7159, Acc.blind: 0.6703, Acc.coffee table: 0.7610, Acc.toilet: 0.9326, Acc.flower: 0.6801, Acc.book: 0.3961, Acc.hill: 0.2416, Acc.bench: 0.6467, Acc.countertop: 0.8402, Acc.stove: 0.8825, Acc.palm: 0.8475, Acc.kitchen island: 0.7379, Acc.computer: 0.9226, Acc.swivel chair: 0.6830, Acc.boat: 0.6084, Acc.bar: 0.5542, Acc.arcade machine: 0.5907, Acc.hovel: 0.6438, Acc.bus: 0.9743, Acc.towel: 0.8683, Acc.light: 0.4960, Acc.truck: 0.5515, Acc.tower: 0.5774, Acc.chandelier: 0.8338, Acc.awning: 0.3868, Acc.streetlight: 0.3813, Acc.booth: 0.5415, Acc.television receiver: 0.8760, Acc.airplane: 0.6290, Acc.dirt track: 0.2243, Acc.apparel: 0.8333, Acc.pole: 0.5086, Acc.land: 0.1319, Acc.bannister: 0.1113, Acc.escalator: 0.5341, Acc.ottoman: 0.6807, Acc.bottle: 0.6773, Acc.buffet: 0.7060, Acc.poster: 0.4473, Acc.stage: 0.2360, Acc.van: 0.5067, Acc.ship: 0.8312, Acc.fountain: 0.2204, Acc.conveyer belt: 0.9138, Acc.canopy: 0.5411, Acc.washer: 0.8134, Acc.plaything: 0.3421, Acc.swimming pool: 0.7204, Acc.stool: 0.5896, Acc.barrel: 0.7341, Acc.basket: 0.5585, Acc.waterfall: 0.5079, Acc.tent: 0.9845, Acc.bag: 0.1834, Acc.minibike: 0.7627, Acc.cradle: 0.9488, Acc.oven: 0.5558, Acc.ball: 0.7604, Acc.food: 0.7207, Acc.step: 0.0874, Acc.tank: 0.5561, Acc.trade name: 0.2140, Acc.microwave: 0.9299, Acc.pot: 0.4852, Acc.animal: 0.6974, Acc.bicycle: 0.7714, Acc.lake: 0.3905, Acc.dishwasher: 0.8780, Acc.screen: 0.8823, Acc.blanket: 0.3597, Acc.sculpture: 0.8512, Acc.hood: 0.7739, Acc.sconce: 0.6203, Acc.vase: 0.6350, Acc.traffic light: 0.4786, Acc.tray: 0.1528, Acc.ashcan: 0.6137, Acc.fan: 0.7605, Acc.pier: 0.7373, Acc.crt screen: 0.0990, Acc.plate: 0.7429, Acc.monitor: 0.0899, Acc.bulletin board: 0.4551, Acc.shower: 0.0026, Acc.radiator: 0.8828, Acc.glass: 0.1684, Acc.clock: 0.5400, Acc.flag: 0.7892
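For reference, the aAcc / mIoU / mAcc summary is the usual reduction of the per-class table: aAcc is overall pixel accuracy, while mIoU and mAcc average the per-class IoU and Acc columns over the 150 classes (so mIoU 50.16 is the mean of the IoU column above). A minimal sketch of those definitions, not mmseg's exact code path, with hist a hypothetical 150x150 confusion matrix (ground-truth rows, prediction columns, ignore-index pixels excluded):

import numpy as np

def summarize(hist):
    tp = np.diag(hist).astype(float)
    iou = tp / (hist.sum(axis=1) + hist.sum(axis=0) - tp)  # per-class IoU column
    acc = tp / hist.sum(axis=1)                            # per-class Acc column
    aacc = tp.sum() / hist.sum()                           # aAcc
    return aacc, np.nanmean(iou), np.nanmean(acc)          # aAcc, mIoU, mAcc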
2023-02-19 07:52:58,165 - mmseg - INFO - Iter [48050/160000]	lr: 4.198e-05, eta: 8:53:51, time: 0.649, data_time: 0.373, memory: 15214, decode.loss_ce: 0.1950, decode.acc_seg: 91.9974, aux.loss_ce: 0.1034, aux.acc_seg: 89.2656, loss: 0.2984, grad_norm: 2.7481
2023-02-19 07:53:13,251 - mmseg - INFO - Iter [48100/160000]	lr: 4.196e-05, eta: 8:53:38, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1986, decode.acc_seg: 92.1431, aux.loss_ce: 0.1029, aux.acc_seg: 89.7931, loss: 0.3015, grad_norm: 2.2884
2023-02-19 07:53:26,902 - mmseg - INFO - Iter [48150/160000]	lr: 4.194e-05, eta: 8:53:22, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2053, decode.acc_seg: 91.9259, aux.loss_ce: 0.1097, aux.acc_seg: 89.1833, loss: 0.3150, grad_norm: 3.1198
2023-02-19 07:53:40,447 - mmseg - INFO - Iter [48200/160000]	lr: 4.193e-05, eta: 8:53:06, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1973, decode.acc_seg: 92.1752, aux.loss_ce: 0.1067, aux.acc_seg: 89.4680, loss: 0.3040, grad_norm: 2.7170
2023-02-19 07:53:54,127 - mmseg - INFO - Iter [48250/160000]	lr: 4.191e-05, eta: 8:52:50, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1874, decode.acc_seg: 92.1963, aux.loss_ce: 0.0991, aux.acc_seg: 89.6920, loss: 0.2864, grad_norm: 3.0595
2023-02-19 07:54:08,043 - mmseg - INFO - Iter [48300/160000]	lr: 4.189e-05, eta: 8:52:35, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1925, decode.acc_seg: 92.1998, aux.loss_ce: 0.1029, aux.acc_seg: 89.6245, loss: 0.2954, grad_norm: 2.5220
2023-02-19 07:54:22,089 - mmseg - INFO - Iter [48350/160000]	lr: 4.187e-05, eta: 8:52:20, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2013, decode.acc_seg: 91.9078, aux.loss_ce: 0.1062, aux.acc_seg: 89.4393, loss: 0.3075, grad_norm: 3.2300
2023-02-19 07:54:35,960 - mmseg - INFO - Iter [48400/160000]	lr: 4.185e-05, eta: 8:52:05, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1963, decode.acc_seg: 92.0172, aux.loss_ce: 0.1050, aux.acc_seg: 89.2257, loss: 0.3013, grad_norm: 2.5061
2023-02-19 07:54:50,567 - mmseg - INFO - Iter [48450/160000]	lr: 4.183e-05, eta: 8:51:51, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1997, decode.acc_seg: 92.0283, aux.loss_ce: 0.1068, aux.acc_seg: 89.4088, loss: 0.3065, grad_norm: 3.1067
2023-02-19 07:55:04,788 - mmseg - INFO - Iter [48500/160000]	lr: 4.181e-05, eta: 8:51:37, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1916, decode.acc_seg: 92.2669, aux.loss_ce: 0.1022, aux.acc_seg: 89.6947, loss: 0.2938, grad_norm: 2.7483
2023-02-19 07:55:18,445 - mmseg - INFO - Iter [48550/160000]	lr: 4.179e-05, eta: 8:51:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1880, decode.acc_seg: 92.1778, aux.loss_ce: 0.1023, aux.acc_seg: 89.3607, loss: 0.2902, grad_norm: 2.3839
2023-02-19 07:55:32,513 - mmseg - INFO - Iter [48600/160000]	lr: 4.178e-05, eta: 8:51:06, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2053, decode.acc_seg: 91.6953, aux.loss_ce: 0.1072, aux.acc_seg: 89.1427, loss: 0.3125, grad_norm: 3.4563
2023-02-19 07:55:46,104 - mmseg - INFO - Iter [48650/160000]	lr: 4.176e-05, eta: 8:50:50, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1923, decode.acc_seg: 92.1980, aux.loss_ce: 0.1027, aux.acc_seg: 89.6337, loss: 0.2949, grad_norm: 2.5315
2023-02-19 07:56:00,456 - mmseg - INFO - Iter [48700/160000]	lr: 4.174e-05, eta: 8:50:36, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1763, decode.acc_seg: 92.7926, aux.loss_ce: 0.0975, aux.acc_seg: 90.1000, loss: 0.2738, grad_norm: 2.0292
2023-02-19 07:56:14,067 - mmseg - INFO - Iter [48750/160000]	lr: 4.172e-05, eta: 8:50:20, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1885, decode.acc_seg: 92.0503, aux.loss_ce: 0.1022, aux.acc_seg: 89.3577, loss: 0.2907, grad_norm: 2.4869
2023-02-19 07:56:27,894 - mmseg - INFO - Iter [48800/160000]	lr: 4.170e-05, eta: 8:50:05, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1873, decode.acc_seg: 92.3593, aux.loss_ce: 0.0999, aux.acc_seg: 89.9040, loss: 0.2871, grad_norm: 2.6663
2023-02-19 07:56:41,608 - mmseg - INFO - Iter [48850/160000]	lr: 4.168e-05, eta: 8:49:49, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2034, decode.acc_seg: 91.9343, aux.loss_ce: 0.1065, aux.acc_seg: 89.5576, loss: 0.3099, grad_norm: 3.1528
2023-02-19 07:56:55,626 - mmseg - INFO - Iter [48900/160000]	lr: 4.166e-05, eta: 8:49:34, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1953, decode.acc_seg: 92.0290, aux.loss_ce: 0.1060, aux.acc_seg: 89.2419, loss: 0.3013, grad_norm: 2.4722
2023-02-19 07:57:09,530 - mmseg - INFO - Iter [48950/160000]	lr: 4.164e-05, eta: 8:49:19, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1893, decode.acc_seg: 92.0854, aux.loss_ce: 0.1050, aux.acc_seg: 89.2991, loss: 0.2943, grad_norm: 2.7208
2023-02-19 07:57:23,219 - mmseg - INFO - Saving checkpoint at 49000 iterations
2023-02-19 07:57:26,437 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 07:57:26,437 - mmseg - INFO - Iter [49000/160000]	lr: 4.163e-05, eta: 8:49:11, time: 0.338, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2047, decode.acc_seg: 92.1676, aux.loss_ce: 0.1122, aux.acc_seg: 89.3405, loss: 0.3169, grad_norm: 3.1337
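The checkpoint/evaluation cadence visible in this log (a checkpoint every 1000 iterations, periodic mIoU evaluation with the best result kept as best_mIoU_iter_*.pth) would typically come from hook settings along these lines. This is an inferred sketch, since that part of the config is not shown in this excerpt, and the evaluation interval of 16000 is only a placeholder (only the 48000-iteration evaluation is visible here):

runner = dict(type='IterBasedRunner', max_iters=160000)
checkpoint_config = dict(by_epoch=False, interval=1000)
evaluation = dict(interval=16000, metric='mIoU', save_best='mIoU')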
2023-02-19 07:57:40,144 - mmseg - INFO - Iter [49050/160000]	lr: 4.161e-05, eta: 8:48:55, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.2116, decode.acc_seg: 91.7061, aux.loss_ce: 0.1104, aux.acc_seg: 89.0155, loss: 0.3220, grad_norm: 2.9008
2023-02-19 07:57:54,060 - mmseg - INFO - Iter [49100/160000]	lr: 4.159e-05, eta: 8:48:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1944, decode.acc_seg: 92.1487, aux.loss_ce: 0.1068, aux.acc_seg: 89.3013, loss: 0.3012, grad_norm: 2.9242
2023-02-19 07:58:08,478 - mmseg - INFO - Iter [49150/160000]	lr: 4.157e-05, eta: 8:48:26, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1742, decode.acc_seg: 92.8980, aux.loss_ce: 0.0957, aux.acc_seg: 90.2270, loss: 0.2699, grad_norm: 2.5267
2023-02-19 07:58:23,115 - mmseg - INFO - Iter [49200/160000]	lr: 4.155e-05, eta: 8:48:12, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1924, decode.acc_seg: 92.4182, aux.loss_ce: 0.1063, aux.acc_seg: 89.5612, loss: 0.2987, grad_norm: 2.6770
2023-02-19 07:58:36,794 - mmseg - INFO - Iter [49250/160000]	lr: 4.153e-05, eta: 8:47:57, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2011, decode.acc_seg: 92.0017, aux.loss_ce: 0.1108, aux.acc_seg: 88.9713, loss: 0.3119, grad_norm: 2.9895
2023-02-19 07:58:52,809 - mmseg - INFO - Iter [49300/160000]	lr: 4.151e-05, eta: 8:47:46, time: 0.320, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1830, decode.acc_seg: 92.4948, aux.loss_ce: 0.0985, aux.acc_seg: 90.0151, loss: 0.2815, grad_norm: 2.4248
2023-02-19 07:59:07,293 - mmseg - INFO - Iter [49350/160000]	lr: 4.149e-05, eta: 8:47:32, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1913, decode.acc_seg: 92.3650, aux.loss_ce: 0.1027, aux.acc_seg: 89.6204, loss: 0.2941, grad_norm: 2.6103
2023-02-19 07:59:21,202 - mmseg - INFO - Iter [49400/160000]	lr: 4.148e-05, eta: 8:47:17, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1898, decode.acc_seg: 92.3981, aux.loss_ce: 0.1006, aux.acc_seg: 89.9768, loss: 0.2905, grad_norm: 2.9475
2023-02-19 07:59:34,896 - mmseg - INFO - Iter [49450/160000]	lr: 4.146e-05, eta: 8:47:01, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1891, decode.acc_seg: 92.3248, aux.loss_ce: 0.1002, aux.acc_seg: 89.7739, loss: 0.2892, grad_norm: 3.3921
2023-02-19 07:59:48,513 - mmseg - INFO - Iter [49500/160000]	lr: 4.144e-05, eta: 8:46:46, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1894, decode.acc_seg: 92.3418, aux.loss_ce: 0.1042, aux.acc_seg: 89.6303, loss: 0.2936, grad_norm: 2.6589
2023-02-19 08:00:03,041 - mmseg - INFO - Iter [49550/160000]	lr: 4.142e-05, eta: 8:46:32, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1884, decode.acc_seg: 92.3710, aux.loss_ce: 0.1055, aux.acc_seg: 89.4016, loss: 0.2939, grad_norm: 2.9249
2023-02-19 08:00:17,052 - mmseg - INFO - Iter [49600/160000]	lr: 4.140e-05, eta: 8:46:17, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1894, decode.acc_seg: 92.3264, aux.loss_ce: 0.1032, aux.acc_seg: 89.4755, loss: 0.2926, grad_norm: 2.6731
2023-02-19 08:00:30,879 - mmseg - INFO - Iter [49650/160000]	lr: 4.138e-05, eta: 8:46:01, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1819, decode.acc_seg: 92.6498, aux.loss_ce: 0.0981, aux.acc_seg: 90.1934, loss: 0.2800, grad_norm: 2.7482
2023-02-19 08:00:44,821 - mmseg - INFO - Iter [49700/160000]	lr: 4.136e-05, eta: 8:45:46, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1982, decode.acc_seg: 91.8639, aux.loss_ce: 0.1055, aux.acc_seg: 89.3442, loss: 0.3037, grad_norm: 2.6667
2023-02-19 08:00:58,691 - mmseg - INFO - Iter [49750/160000]	lr: 4.134e-05, eta: 8:45:31, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1939, decode.acc_seg: 92.1800, aux.loss_ce: 0.1061, aux.acc_seg: 89.4228, loss: 0.3000, grad_norm: 2.4378
2023-02-19 08:01:12,691 - mmseg - INFO - Iter [49800/160000]	lr: 4.133e-05, eta: 8:45:16, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1970, decode.acc_seg: 92.2730, aux.loss_ce: 0.1077, aux.acc_seg: 89.4882, loss: 0.3047, grad_norm: 2.8213
2023-02-19 08:01:27,092 - mmseg - INFO - Iter [49850/160000]	lr: 4.131e-05, eta: 8:45:02, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1882, decode.acc_seg: 92.2985, aux.loss_ce: 0.1026, aux.acc_seg: 89.6497, loss: 0.2908, grad_norm: 2.6076
2023-02-19 08:01:40,660 - mmseg - INFO - Iter [49900/160000]	lr: 4.129e-05, eta: 8:44:46, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1911, decode.acc_seg: 92.3213, aux.loss_ce: 0.1034, aux.acc_seg: 89.7087, loss: 0.2945, grad_norm: 3.9055
2023-02-19 08:01:54,246 - mmseg - INFO - Iter [49950/160000]	lr: 4.127e-05, eta: 8:44:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1975, decode.acc_seg: 91.9029, aux.loss_ce: 0.1026, aux.acc_seg: 89.5219, loss: 0.3001, grad_norm: 2.6637
2023-02-19 08:02:08,384 - mmseg - INFO - Saving checkpoint at 50000 iterations
2023-02-19 08:02:11,591 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:02:11,591 - mmseg - INFO - Iter [50000/160000]	lr: 4.125e-05, eta: 8:44:23, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1928, decode.acc_seg: 92.3459, aux.loss_ce: 0.1071, aux.acc_seg: 89.2904, loss: 0.2999, grad_norm: 2.6977
2023-02-19 08:02:25,692 - mmseg - INFO - Iter [50050/160000]	lr: 4.123e-05, eta: 8:44:08, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1862, decode.acc_seg: 92.3900, aux.loss_ce: 0.1021, aux.acc_seg: 89.7893, loss: 0.2882, grad_norm: 2.8443
2023-02-19 08:02:40,011 - mmseg - INFO - Iter [50100/160000]	lr: 4.121e-05, eta: 8:43:54, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1862, decode.acc_seg: 92.6440, aux.loss_ce: 0.1016, aux.acc_seg: 89.8498, loss: 0.2878, grad_norm: 3.0983
2023-02-19 08:02:53,622 - mmseg - INFO - Iter [50150/160000]	lr: 4.119e-05, eta: 8:43:38, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1900, decode.acc_seg: 92.3603, aux.loss_ce: 0.1017, aux.acc_seg: 89.7671, loss: 0.2916, grad_norm: 2.6370
2023-02-19 08:03:07,619 - mmseg - INFO - Iter [50200/160000]	lr: 4.118e-05, eta: 8:43:23, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1951, decode.acc_seg: 92.2658, aux.loss_ce: 0.1062, aux.acc_seg: 89.4934, loss: 0.3013, grad_norm: 2.2905
2023-02-19 08:03:21,434 - mmseg - INFO - Iter [50250/160000]	lr: 4.116e-05, eta: 8:43:07, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1957, decode.acc_seg: 91.9820, aux.loss_ce: 0.1044, aux.acc_seg: 89.4214, loss: 0.3001, grad_norm: 2.5538
2023-02-19 08:03:35,319 - mmseg - INFO - Iter [50300/160000]	lr: 4.114e-05, eta: 8:42:52, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1905, decode.acc_seg: 92.4069, aux.loss_ce: 0.1052, aux.acc_seg: 89.4391, loss: 0.2957, grad_norm: 2.6371
2023-02-19 08:03:49,081 - mmseg - INFO - Iter [50350/160000]	lr: 4.112e-05, eta: 8:42:37, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1879, decode.acc_seg: 92.2589, aux.loss_ce: 0.1021, aux.acc_seg: 89.6238, loss: 0.2900, grad_norm: 2.7601
2023-02-19 08:04:03,454 - mmseg - INFO - Iter [50400/160000]	lr: 4.110e-05, eta: 8:42:23, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2175, decode.acc_seg: 91.3159, aux.loss_ce: 0.1158, aux.acc_seg: 88.6696, loss: 0.3333, grad_norm: 3.0505
2023-02-19 08:04:17,608 - mmseg - INFO - Iter [50450/160000]	lr: 4.108e-05, eta: 8:42:08, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.2000, decode.acc_seg: 91.9712, aux.loss_ce: 0.1042, aux.acc_seg: 89.3987, loss: 0.3043, grad_norm: 3.6682
2023-02-19 08:04:31,611 - mmseg - INFO - Iter [50500/160000]	lr: 4.106e-05, eta: 8:41:53, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1812, decode.acc_seg: 92.6964, aux.loss_ce: 0.1023, aux.acc_seg: 89.6399, loss: 0.2835, grad_norm: 2.7889
2023-02-19 08:04:47,437 - mmseg - INFO - Iter [50550/160000]	lr: 4.104e-05, eta: 8:41:42, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1753, decode.acc_seg: 92.7752, aux.loss_ce: 0.0944, aux.acc_seg: 90.3070, loss: 0.2697, grad_norm: 2.2932
2023-02-19 08:05:01,338 - mmseg - INFO - Iter [50600/160000]	lr: 4.103e-05, eta: 8:41:27, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1869, decode.acc_seg: 92.4649, aux.loss_ce: 0.1021, aux.acc_seg: 89.7582, loss: 0.2890, grad_norm: 3.3937
2023-02-19 08:05:15,874 - mmseg - INFO - Iter [50650/160000]	lr: 4.101e-05, eta: 8:41:13, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1974, decode.acc_seg: 92.1623, aux.loss_ce: 0.1052, aux.acc_seg: 89.5069, loss: 0.3026, grad_norm: 2.5645
2023-02-19 08:05:29,612 - mmseg - INFO - Iter [50700/160000]	lr: 4.099e-05, eta: 8:40:58, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1768, decode.acc_seg: 92.8466, aux.loss_ce: 0.0972, aux.acc_seg: 90.2491, loss: 0.2739, grad_norm: 2.3329
2023-02-19 08:05:43,322 - mmseg - INFO - Iter [50750/160000]	lr: 4.097e-05, eta: 8:40:42, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1859, decode.acc_seg: 92.1916, aux.loss_ce: 0.0990, aux.acc_seg: 89.7262, loss: 0.2849, grad_norm: 2.3758
2023-02-19 08:05:57,189 - mmseg - INFO - Iter [50800/160000]	lr: 4.095e-05, eta: 8:40:27, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1786, decode.acc_seg: 92.8001, aux.loss_ce: 0.0988, aux.acc_seg: 90.1758, loss: 0.2774, grad_norm: 2.7888
2023-02-19 08:06:11,608 - mmseg - INFO - Iter [50850/160000]	lr: 4.093e-05, eta: 8:40:13, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1725, decode.acc_seg: 92.8601, aux.loss_ce: 0.0966, aux.acc_seg: 89.9414, loss: 0.2691, grad_norm: 2.5799
2023-02-19 08:06:25,374 - mmseg - INFO - Iter [50900/160000]	lr: 4.091e-05, eta: 8:39:57, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1847, decode.acc_seg: 92.6804, aux.loss_ce: 0.0991, aux.acc_seg: 90.1239, loss: 0.2837, grad_norm: 2.4406
2023-02-19 08:06:39,855 - mmseg - INFO - Iter [50950/160000]	lr: 4.089e-05, eta: 8:39:43, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1794, decode.acc_seg: 92.7086, aux.loss_ce: 0.0973, aux.acc_seg: 90.2767, loss: 0.2767, grad_norm: 2.5348
2023-02-19 08:06:55,074 - mmseg - INFO - Saving checkpoint at 51000 iterations
2023-02-19 08:06:58,284 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:06:58,285 - mmseg - INFO - Iter [51000/160000]	lr: 4.088e-05, eta: 8:39:38, time: 0.369, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1839, decode.acc_seg: 92.4476, aux.loss_ce: 0.1004, aux.acc_seg: 89.7308, loss: 0.2842, grad_norm: 2.4491
2023-02-19 08:07:12,291 - mmseg - INFO - Iter [51050/160000]	lr: 4.086e-05, eta: 8:39:23, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1882, decode.acc_seg: 92.4754, aux.loss_ce: 0.1034, aux.acc_seg: 89.6683, loss: 0.2916, grad_norm: 2.3994
2023-02-19 08:07:26,103 - mmseg - INFO - Iter [51100/160000]	lr: 4.084e-05, eta: 8:39:08, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1834, decode.acc_seg: 92.5664, aux.loss_ce: 0.0990, aux.acc_seg: 90.0942, loss: 0.2824, grad_norm: 2.2900
2023-02-19 08:07:40,411 - mmseg - INFO - Iter [51150/160000]	lr: 4.082e-05, eta: 8:38:53, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1916, decode.acc_seg: 92.4004, aux.loss_ce: 0.1044, aux.acc_seg: 89.5661, loss: 0.2960, grad_norm: 2.9624
2023-02-19 08:07:54,151 - mmseg - INFO - Iter [51200/160000]	lr: 4.080e-05, eta: 8:38:38, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1764, decode.acc_seg: 92.8933, aux.loss_ce: 0.1016, aux.acc_seg: 89.8204, loss: 0.2780, grad_norm: 2.3639
2023-02-19 08:08:08,283 - mmseg - INFO - Iter [51250/160000]	lr: 4.078e-05, eta: 8:38:23, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1939, decode.acc_seg: 92.3068, aux.loss_ce: 0.1068, aux.acc_seg: 89.3759, loss: 0.3007, grad_norm: 2.6933
2023-02-19 08:08:21,884 - mmseg - INFO - Iter [51300/160000]	lr: 4.076e-05, eta: 8:38:08, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1845, decode.acc_seg: 92.5699, aux.loss_ce: 0.1032, aux.acc_seg: 89.7893, loss: 0.2877, grad_norm: 3.1546
2023-02-19 08:08:35,508 - mmseg - INFO - Iter [51350/160000]	lr: 4.074e-05, eta: 8:37:52, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1848, decode.acc_seg: 92.2939, aux.loss_ce: 0.0985, aux.acc_seg: 89.9463, loss: 0.2832, grad_norm: 2.1100
2023-02-19 08:08:49,122 - mmseg - INFO - Iter [51400/160000]	lr: 4.073e-05, eta: 8:37:36, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1953, decode.acc_seg: 92.0564, aux.loss_ce: 0.1062, aux.acc_seg: 89.3842, loss: 0.3015, grad_norm: 2.7262
2023-02-19 08:09:03,350 - mmseg - INFO - Iter [51450/160000]	lr: 4.071e-05, eta: 8:37:22, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1691, decode.acc_seg: 93.1340, aux.loss_ce: 0.0944, aux.acc_seg: 90.4440, loss: 0.2634, grad_norm: 2.4165
2023-02-19 08:09:17,023 - mmseg - INFO - Iter [51500/160000]	lr: 4.069e-05, eta: 8:37:06, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1887, decode.acc_seg: 92.1191, aux.loss_ce: 0.1010, aux.acc_seg: 89.6209, loss: 0.2896, grad_norm: 2.0708
2023-02-19 08:09:30,967 - mmseg - INFO - Iter [51550/160000]	lr: 4.067e-05, eta: 8:36:51, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1988, decode.acc_seg: 92.2355, aux.loss_ce: 0.1091, aux.acc_seg: 89.1151, loss: 0.3079, grad_norm: 3.5594
2023-02-19 08:09:45,292 - mmseg - INFO - Iter [51600/160000]	lr: 4.065e-05, eta: 8:36:37, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1955, decode.acc_seg: 92.1292, aux.loss_ce: 0.1076, aux.acc_seg: 89.2235, loss: 0.3031, grad_norm: 2.9739
2023-02-19 08:09:59,548 - mmseg - INFO - Iter [51650/160000]	lr: 4.063e-05, eta: 8:36:22, time: 0.286, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1940, decode.acc_seg: 92.1452, aux.loss_ce: 0.1048, aux.acc_seg: 89.4116, loss: 0.2989, grad_norm: 2.6855
2023-02-19 08:10:13,430 - mmseg - INFO - Iter [51700/160000]	lr: 4.061e-05, eta: 8:36:07, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1903, decode.acc_seg: 92.4271, aux.loss_ce: 0.1040, aux.acc_seg: 89.6986, loss: 0.2943, grad_norm: 2.2521
2023-02-19 08:10:27,211 - mmseg - INFO - Iter [51750/160000]	lr: 4.059e-05, eta: 8:35:52, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1771, decode.acc_seg: 92.7184, aux.loss_ce: 0.0977, aux.acc_seg: 90.1120, loss: 0.2748, grad_norm: 2.5714
2023-02-19 08:10:43,254 - mmseg - INFO - Iter [51800/160000]	lr: 4.058e-05, eta: 8:35:41, time: 0.321, data_time: 0.048, memory: 15214, decode.loss_ce: 0.2110, decode.acc_seg: 91.4189, aux.loss_ce: 0.1120, aux.acc_seg: 88.9586, loss: 0.3230, grad_norm: 3.1088
2023-02-19 08:10:56,989 - mmseg - INFO - Iter [51850/160000]	lr: 4.056e-05, eta: 8:35:26, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1801, decode.acc_seg: 92.6461, aux.loss_ce: 0.0961, aux.acc_seg: 90.2959, loss: 0.2762, grad_norm: 2.4706
2023-02-19 08:11:10,850 - mmseg - INFO - Iter [51900/160000]	lr: 4.054e-05, eta: 8:35:10, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1887, decode.acc_seg: 92.2150, aux.loss_ce: 0.0999, aux.acc_seg: 89.7732, loss: 0.2885, grad_norm: 2.8923
2023-02-19 08:11:24,633 - mmseg - INFO - Iter [51950/160000]	lr: 4.052e-05, eta: 8:34:55, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1823, decode.acc_seg: 92.5315, aux.loss_ce: 0.0988, aux.acc_seg: 89.8780, loss: 0.2811, grad_norm: 2.4902
2023-02-19 08:11:38,306 - mmseg - INFO - Saving checkpoint at 52000 iterations
2023-02-19 08:11:41,523 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:11:41,523 - mmseg - INFO - Iter [52000/160000]	lr: 4.050e-05, eta: 8:34:46, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1864, decode.acc_seg: 92.4807, aux.loss_ce: 0.1023, aux.acc_seg: 89.6582, loss: 0.2887, grad_norm: 2.7520
2023-02-19 08:11:55,293 - mmseg - INFO - Iter [52050/160000]	lr: 4.048e-05, eta: 8:34:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1885, decode.acc_seg: 92.0499, aux.loss_ce: 0.1024, aux.acc_seg: 89.2221, loss: 0.2909, grad_norm: 2.6471
2023-02-19 08:12:09,976 - mmseg - INFO - Iter [52100/160000]	lr: 4.046e-05, eta: 8:34:17, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1878, decode.acc_seg: 92.3456, aux.loss_ce: 0.1018, aux.acc_seg: 89.7031, loss: 0.2896, grad_norm: 2.9998
2023-02-19 08:12:23,771 - mmseg - INFO - Iter [52150/160000]	lr: 4.044e-05, eta: 8:34:02, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1892, decode.acc_seg: 92.3031, aux.loss_ce: 0.1026, aux.acc_seg: 89.7447, loss: 0.2918, grad_norm: 2.5499
2023-02-19 08:12:37,516 - mmseg - INFO - Iter [52200/160000]	lr: 4.043e-05, eta: 8:33:46, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1792, decode.acc_seg: 92.6147, aux.loss_ce: 0.0962, aux.acc_seg: 90.2046, loss: 0.2754, grad_norm: 2.5967
2023-02-19 08:12:51,164 - mmseg - INFO - Iter [52250/160000]	lr: 4.041e-05, eta: 8:33:31, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1742, decode.acc_seg: 92.8756, aux.loss_ce: 0.0937, aux.acc_seg: 90.4595, loss: 0.2679, grad_norm: 2.2635
2023-02-19 08:13:06,047 - mmseg - INFO - Iter [52300/160000]	lr: 4.039e-05, eta: 8:33:18, time: 0.298, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1876, decode.acc_seg: 92.3938, aux.loss_ce: 0.1011, aux.acc_seg: 89.8567, loss: 0.2888, grad_norm: 2.4762
2023-02-19 08:13:19,773 - mmseg - INFO - Iter [52350/160000]	lr: 4.037e-05, eta: 8:33:02, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1739, decode.acc_seg: 92.9958, aux.loss_ce: 0.0967, aux.acc_seg: 90.3044, loss: 0.2706, grad_norm: 2.5148
2023-02-19 08:13:33,779 - mmseg - INFO - Iter [52400/160000]	lr: 4.035e-05, eta: 8:32:47, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1773, decode.acc_seg: 92.6715, aux.loss_ce: 0.0947, aux.acc_seg: 90.2595, loss: 0.2721, grad_norm: 2.6739
2023-02-19 08:13:47,564 - mmseg - INFO - Iter [52450/160000]	lr: 4.033e-05, eta: 8:32:32, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1784, decode.acc_seg: 92.7712, aux.loss_ce: 0.0958, aux.acc_seg: 90.1564, loss: 0.2742, grad_norm: 2.3929
2023-02-19 08:14:01,138 - mmseg - INFO - Iter [52500/160000]	lr: 4.031e-05, eta: 8:32:16, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1892, decode.acc_seg: 92.4731, aux.loss_ce: 0.1022, aux.acc_seg: 89.7752, loss: 0.2915, grad_norm: 2.5269
2023-02-19 08:14:15,305 - mmseg - INFO - Iter [52550/160000]	lr: 4.029e-05, eta: 8:32:02, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1829, decode.acc_seg: 92.4412, aux.loss_ce: 0.1008, aux.acc_seg: 89.5871, loss: 0.2837, grad_norm: 2.5426
2023-02-19 08:14:29,618 - mmseg - INFO - Iter [52600/160000]	lr: 4.028e-05, eta: 8:31:47, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1843, decode.acc_seg: 92.5086, aux.loss_ce: 0.1020, aux.acc_seg: 89.5900, loss: 0.2863, grad_norm: 2.2985
2023-02-19 08:14:43,673 - mmseg - INFO - Iter [52650/160000]	lr: 4.026e-05, eta: 8:31:33, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1815, decode.acc_seg: 92.5397, aux.loss_ce: 0.0985, aux.acc_seg: 90.0045, loss: 0.2800, grad_norm: 2.5861
2023-02-19 08:14:57,274 - mmseg - INFO - Iter [52700/160000]	lr: 4.024e-05, eta: 8:31:17, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1913, decode.acc_seg: 92.3818, aux.loss_ce: 0.1026, aux.acc_seg: 89.7940, loss: 0.2939, grad_norm: 2.2822
2023-02-19 08:15:11,127 - mmseg - INFO - Iter [52750/160000]	lr: 4.022e-05, eta: 8:31:02, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1841, decode.acc_seg: 92.4924, aux.loss_ce: 0.1000, aux.acc_seg: 89.8845, loss: 0.2840, grad_norm: 2.1646
2023-02-19 08:15:24,697 - mmseg - INFO - Iter [52800/160000]	lr: 4.020e-05, eta: 8:30:46, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1855, decode.acc_seg: 92.4183, aux.loss_ce: 0.1000, aux.acc_seg: 89.8482, loss: 0.2854, grad_norm: 2.7429
2023-02-19 08:15:38,825 - mmseg - INFO - Iter [52850/160000]	lr: 4.018e-05, eta: 8:30:31, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1966, decode.acc_seg: 92.0487, aux.loss_ce: 0.1038, aux.acc_seg: 89.4593, loss: 0.3004, grad_norm: 3.1955
2023-02-19 08:15:52,573 - mmseg - INFO - Iter [52900/160000]	lr: 4.016e-05, eta: 8:30:16, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1861, decode.acc_seg: 92.4151, aux.loss_ce: 0.1015, aux.acc_seg: 89.7600, loss: 0.2876, grad_norm: 2.7127
2023-02-19 08:16:06,135 - mmseg - INFO - Iter [52950/160000]	lr: 4.014e-05, eta: 8:30:00, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1802, decode.acc_seg: 92.7198, aux.loss_ce: 0.1002, aux.acc_seg: 90.0973, loss: 0.2804, grad_norm: 2.6951
2023-02-19 08:16:19,753 - mmseg - INFO - Saving checkpoint at 53000 iterations
2023-02-19 08:16:22,960 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:16:22,961 - mmseg - INFO - Iter [53000/160000]	lr: 4.013e-05, eta: 8:29:51, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1893, decode.acc_seg: 92.3981, aux.loss_ce: 0.1032, aux.acc_seg: 89.6666, loss: 0.2925, grad_norm: 2.5204
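To plot loss or lr curves straight from a text log like this one, a small parser is enough. A hypothetical helper (the accompanying .log.json, if it was kept, is the easier thing to parse):

import re

# Matches the per-50-iteration lines above and pulls out (iter, lr, total loss).
PATTERN = re.compile(r"Iter \[(\d+)/\d+\]\s+lr: ([\d.e-]+).*, loss: ([\d.]+)")

def parse_log(path):
    points = []
    with open(path) as f:
        for line in f:
            m = PATTERN.search(line)
            if m:
                iteration, lr, loss = m.groups()
                points.append((int(iteration), float(lr), float(loss)))
    return points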
2023-02-19 08:16:40,210 - mmseg - INFO - Iter [53050/160000]	lr: 4.011e-05, eta: 8:29:43, time: 0.345, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1810, decode.acc_seg: 92.5447, aux.loss_ce: 0.1005, aux.acc_seg: 89.7220, loss: 0.2815, grad_norm: 2.6486
2023-02-19 08:16:54,318 - mmseg - INFO - Iter [53100/160000]	lr: 4.009e-05, eta: 8:29:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1887, decode.acc_seg: 92.6782, aux.loss_ce: 0.1041, aux.acc_seg: 89.8227, loss: 0.2928, grad_norm: 3.0328
2023-02-19 08:17:07,976 - mmseg - INFO - Iter [53150/160000]	lr: 4.007e-05, eta: 8:29:12, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1763, decode.acc_seg: 92.6539, aux.loss_ce: 0.0977, aux.acc_seg: 89.9146, loss: 0.2740, grad_norm: 2.4541
2023-02-19 08:17:21,645 - mmseg - INFO - Iter [53200/160000]	lr: 4.005e-05, eta: 8:28:57, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1703, decode.acc_seg: 93.0986, aux.loss_ce: 0.0939, aux.acc_seg: 90.5867, loss: 0.2642, grad_norm: 2.2060
2023-02-19 08:17:35,498 - mmseg - INFO - Iter [53250/160000]	lr: 4.003e-05, eta: 8:28:42, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1749, decode.acc_seg: 92.8927, aux.loss_ce: 0.0945, aux.acc_seg: 90.4139, loss: 0.2694, grad_norm: 2.0891
2023-02-19 08:17:49,882 - mmseg - INFO - Iter [53300/160000]	lr: 4.001e-05, eta: 8:28:27, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1868, decode.acc_seg: 92.4238, aux.loss_ce: 0.1007, aux.acc_seg: 89.7763, loss: 0.2876, grad_norm: 2.5334
2023-02-19 08:18:03,864 - mmseg - INFO - Iter [53350/160000]	lr: 3.999e-05, eta: 8:28:13, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1947, decode.acc_seg: 92.1375, aux.loss_ce: 0.1059, aux.acc_seg: 89.3762, loss: 0.3005, grad_norm: 2.7363
2023-02-19 08:18:17,705 - mmseg - INFO - Iter [53400/160000]	lr: 3.998e-05, eta: 8:27:57, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1841, decode.acc_seg: 92.3138, aux.loss_ce: 0.0973, aux.acc_seg: 89.9559, loss: 0.2814, grad_norm: 2.3722
2023-02-19 08:18:31,741 - mmseg - INFO - Iter [53450/160000]	lr: 3.996e-05, eta: 8:27:43, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1653, decode.acc_seg: 93.2782, aux.loss_ce: 0.0909, aux.acc_seg: 90.8113, loss: 0.2562, grad_norm: 2.0084
2023-02-19 08:18:45,402 - mmseg - INFO - Iter [53500/160000]	lr: 3.994e-05, eta: 8:27:27, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1886, decode.acc_seg: 92.3653, aux.loss_ce: 0.1020, aux.acc_seg: 89.5227, loss: 0.2906, grad_norm: 2.8516
2023-02-19 08:18:59,067 - mmseg - INFO - Iter [53550/160000]	lr: 3.992e-05, eta: 8:27:11, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1753, decode.acc_seg: 92.5958, aux.loss_ce: 0.0965, aux.acc_seg: 89.9653, loss: 0.2717, grad_norm: 3.2053
2023-02-19 08:19:12,720 - mmseg - INFO - Iter [53600/160000]	lr: 3.990e-05, eta: 8:26:56, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1837, decode.acc_seg: 92.5850, aux.loss_ce: 0.1009, aux.acc_seg: 89.9380, loss: 0.2846, grad_norm: 2.7002
2023-02-19 08:19:26,710 - mmseg - INFO - Iter [53650/160000]	lr: 3.988e-05, eta: 8:26:41, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1849, decode.acc_seg: 92.7327, aux.loss_ce: 0.0993, aux.acc_seg: 90.2255, loss: 0.2842, grad_norm: 2.1697
2023-02-19 08:19:40,750 - mmseg - INFO - Iter [53700/160000]	lr: 3.986e-05, eta: 8:26:26, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1746, decode.acc_seg: 92.7992, aux.loss_ce: 0.0970, aux.acc_seg: 90.1250, loss: 0.2716, grad_norm: 2.4650
2023-02-19 08:19:55,198 - mmseg - INFO - Iter [53750/160000]	lr: 3.984e-05, eta: 8:26:12, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1903, decode.acc_seg: 92.4413, aux.loss_ce: 0.1030, aux.acc_seg: 89.8693, loss: 0.2933, grad_norm: 2.3840
2023-02-19 08:20:09,257 - mmseg - INFO - Iter [53800/160000]	lr: 3.983e-05, eta: 8:25:57, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1841, decode.acc_seg: 92.5562, aux.loss_ce: 0.1003, aux.acc_seg: 89.9357, loss: 0.2844, grad_norm: 2.5897
2023-02-19 08:20:23,102 - mmseg - INFO - Iter [53850/160000]	lr: 3.981e-05, eta: 8:25:42, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1863, decode.acc_seg: 92.2950, aux.loss_ce: 0.1013, aux.acc_seg: 89.6320, loss: 0.2875, grad_norm: 2.4600
2023-02-19 08:20:36,789 - mmseg - INFO - Iter [53900/160000]	lr: 3.979e-05, eta: 8:25:27, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1887, decode.acc_seg: 92.3665, aux.loss_ce: 0.1024, aux.acc_seg: 89.6923, loss: 0.2910, grad_norm: 2.9050
2023-02-19 08:20:50,442 - mmseg - INFO - Iter [53950/160000]	lr: 3.977e-05, eta: 8:25:11, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1899, decode.acc_seg: 92.2383, aux.loss_ce: 0.1032, aux.acc_seg: 89.6065, loss: 0.2931, grad_norm: 3.1088
2023-02-19 08:21:04,329 - mmseg - INFO - Saving checkpoint at 54000 iterations
2023-02-19 08:21:07,563 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:21:07,563 - mmseg - INFO - Iter [54000/160000]	lr: 3.975e-05, eta: 8:25:03, time: 0.343, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1793, decode.acc_seg: 92.8858, aux.loss_ce: 0.1038, aux.acc_seg: 89.9669, loss: 0.2831, grad_norm: 2.7610
2023-02-19 08:21:21,263 - mmseg - INFO - Iter [54050/160000]	lr: 3.973e-05, eta: 8:24:47, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1912, decode.acc_seg: 92.1207, aux.loss_ce: 0.1017, aux.acc_seg: 89.6353, loss: 0.2930, grad_norm: 2.4630
2023-02-19 08:21:35,100 - mmseg - INFO - Iter [54100/160000]	lr: 3.971e-05, eta: 8:24:32, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1833, decode.acc_seg: 92.6259, aux.loss_ce: 0.1012, aux.acc_seg: 90.0109, loss: 0.2844, grad_norm: 2.7560
2023-02-19 08:21:48,711 - mmseg - INFO - Iter [54150/160000]	lr: 3.969e-05, eta: 8:24:16, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1907, decode.acc_seg: 92.4743, aux.loss_ce: 0.1025, aux.acc_seg: 89.8099, loss: 0.2932, grad_norm: 3.2888
2023-02-19 08:22:02,617 - mmseg - INFO - Iter [54200/160000]	lr: 3.968e-05, eta: 8:24:01, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1776, decode.acc_seg: 92.7479, aux.loss_ce: 0.0950, aux.acc_seg: 90.1944, loss: 0.2726, grad_norm: 2.7125
2023-02-19 08:22:16,474 - mmseg - INFO - Iter [54250/160000]	lr: 3.966e-05, eta: 8:23:46, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1881, decode.acc_seg: 92.6035, aux.loss_ce: 0.1053, aux.acc_seg: 89.6294, loss: 0.2934, grad_norm: 2.9176
2023-02-19 08:22:30,100 - mmseg - INFO - Iter [54300/160000]	lr: 3.964e-05, eta: 8:23:30, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1785, decode.acc_seg: 92.8177, aux.loss_ce: 0.0985, aux.acc_seg: 90.0461, loss: 0.2770, grad_norm: 2.4488
2023-02-19 08:22:46,376 - mmseg - INFO - Iter [54350/160000]	lr: 3.962e-05, eta: 8:23:20, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1868, decode.acc_seg: 92.5326, aux.loss_ce: 0.0998, aux.acc_seg: 90.1781, loss: 0.2866, grad_norm: 2.3735
2023-02-19 08:23:00,384 - mmseg - INFO - Iter [54400/160000]	lr: 3.960e-05, eta: 8:23:05, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1708, decode.acc_seg: 92.9861, aux.loss_ce: 0.0955, aux.acc_seg: 90.3542, loss: 0.2663, grad_norm: 2.4663
2023-02-19 08:23:14,326 - mmseg - INFO - Iter [54450/160000]	lr: 3.958e-05, eta: 8:22:50, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1837, decode.acc_seg: 92.3922, aux.loss_ce: 0.0996, aux.acc_seg: 89.9217, loss: 0.2833, grad_norm: 2.7113
2023-02-19 08:23:28,077 - mmseg - INFO - Iter [54500/160000]	lr: 3.956e-05, eta: 8:22:35, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1764, decode.acc_seg: 92.7903, aux.loss_ce: 0.1009, aux.acc_seg: 89.7414, loss: 0.2773, grad_norm: 2.4653
2023-02-19 08:23:42,628 - mmseg - INFO - Iter [54550/160000]	lr: 3.954e-05, eta: 8:22:21, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1855, decode.acc_seg: 92.5409, aux.loss_ce: 0.1009, aux.acc_seg: 89.8789, loss: 0.2863, grad_norm: 3.0886
2023-02-19 08:23:56,259 - mmseg - INFO - Iter [54600/160000]	lr: 3.953e-05, eta: 8:22:06, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1809, decode.acc_seg: 92.4392, aux.loss_ce: 0.0996, aux.acc_seg: 89.7300, loss: 0.2805, grad_norm: 2.4022
2023-02-19 08:24:09,983 - mmseg - INFO - Iter [54650/160000]	lr: 3.951e-05, eta: 8:21:50, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1797, decode.acc_seg: 92.6362, aux.loss_ce: 0.0998, aux.acc_seg: 89.7965, loss: 0.2796, grad_norm: 2.4307
2023-02-19 08:24:23,568 - mmseg - INFO - Iter [54700/160000]	lr: 3.949e-05, eta: 8:21:34, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1748, decode.acc_seg: 92.6939, aux.loss_ce: 0.0949, aux.acc_seg: 90.2826, loss: 0.2697, grad_norm: 2.9142
2023-02-19 08:24:37,530 - mmseg - INFO - Iter [54750/160000]	lr: 3.947e-05, eta: 8:21:19, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1716, decode.acc_seg: 92.8366, aux.loss_ce: 0.0979, aux.acc_seg: 89.9391, loss: 0.2695, grad_norm: 2.8972
2023-02-19 08:24:52,100 - mmseg - INFO - Iter [54800/160000]	lr: 3.945e-05, eta: 8:21:06, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1853, decode.acc_seg: 92.6296, aux.loss_ce: 0.1038, aux.acc_seg: 89.6576, loss: 0.2891, grad_norm: 2.8887
2023-02-19 08:25:05,824 - mmseg - INFO - Iter [54850/160000]	lr: 3.943e-05, eta: 8:20:50, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1831, decode.acc_seg: 92.3818, aux.loss_ce: 0.0981, aux.acc_seg: 89.9081, loss: 0.2812, grad_norm: 2.4131
2023-02-19 08:25:20,356 - mmseg - INFO - Iter [54900/160000]	lr: 3.941e-05, eta: 8:20:37, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1777, decode.acc_seg: 92.8008, aux.loss_ce: 0.0976, aux.acc_seg: 90.2470, loss: 0.2753, grad_norm: 2.4912
2023-02-19 08:25:34,052 - mmseg - INFO - Iter [54950/160000]	lr: 3.939e-05, eta: 8:20:21, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1878, decode.acc_seg: 92.2371, aux.loss_ce: 0.1022, aux.acc_seg: 89.5011, loss: 0.2900, grad_norm: 2.9087
2023-02-19 08:25:48,270 - mmseg - INFO - Saving checkpoint at 55000 iterations
2023-02-19 08:25:51,528 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:25:51,528 - mmseg - INFO - Iter [55000/160000]	lr: 3.938e-05, eta: 8:20:13, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1846, decode.acc_seg: 92.4525, aux.loss_ce: 0.0999, aux.acc_seg: 89.9509, loss: 0.2845, grad_norm: 2.4645
2023-02-19 08:26:05,177 - mmseg - INFO - Iter [55050/160000]	lr: 3.936e-05, eta: 8:19:57, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1770, decode.acc_seg: 93.0533, aux.loss_ce: 0.0935, aux.acc_seg: 90.5905, loss: 0.2705, grad_norm: 2.6974
2023-02-19 08:26:18,811 - mmseg - INFO - Iter [55100/160000]	lr: 3.934e-05, eta: 8:19:42, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1725, decode.acc_seg: 93.0052, aux.loss_ce: 0.1023, aux.acc_seg: 89.8693, loss: 0.2748, grad_norm: 2.3036
2023-02-19 08:26:32,433 - mmseg - INFO - Iter [55150/160000]	lr: 3.932e-05, eta: 8:19:26, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1827, decode.acc_seg: 92.5180, aux.loss_ce: 0.1001, aux.acc_seg: 89.9775, loss: 0.2828, grad_norm: 2.6435
2023-02-19 08:26:46,451 - mmseg - INFO - Iter [55200/160000]	lr: 3.930e-05, eta: 8:19:12, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1809, decode.acc_seg: 92.5545, aux.loss_ce: 0.1020, aux.acc_seg: 89.7052, loss: 0.2829, grad_norm: 2.2819
2023-02-19 08:27:00,204 - mmseg - INFO - Iter [55250/160000]	lr: 3.928e-05, eta: 8:18:56, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1848, decode.acc_seg: 92.4397, aux.loss_ce: 0.1028, aux.acc_seg: 89.7158, loss: 0.2876, grad_norm: 2.5292
2023-02-19 08:27:14,697 - mmseg - INFO - Iter [55300/160000]	lr: 3.926e-05, eta: 8:18:42, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1764, decode.acc_seg: 92.7184, aux.loss_ce: 0.0953, aux.acc_seg: 90.2762, loss: 0.2717, grad_norm: 2.8340
2023-02-19 08:27:28,244 - mmseg - INFO - Iter [55350/160000]	lr: 3.924e-05, eta: 8:18:27, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1776, decode.acc_seg: 92.7720, aux.loss_ce: 0.0990, aux.acc_seg: 89.9631, loss: 0.2766, grad_norm: 2.6305
2023-02-19 08:27:42,452 - mmseg - INFO - Iter [55400/160000]	lr: 3.923e-05, eta: 8:18:12, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1904, decode.acc_seg: 92.3222, aux.loss_ce: 0.1041, aux.acc_seg: 89.7315, loss: 0.2944, grad_norm: 2.9525
2023-02-19 08:27:56,115 - mmseg - INFO - Iter [55450/160000]	lr: 3.921e-05, eta: 8:17:57, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1710, decode.acc_seg: 93.0281, aux.loss_ce: 0.0974, aux.acc_seg: 90.0305, loss: 0.2685, grad_norm: 2.8453
2023-02-19 08:28:10,220 - mmseg - INFO - Iter [55500/160000]	lr: 3.919e-05, eta: 8:17:42, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1824, decode.acc_seg: 92.6039, aux.loss_ce: 0.0981, aux.acc_seg: 90.0755, loss: 0.2805, grad_norm: 2.8062
2023-02-19 08:28:23,836 - mmseg - INFO - Iter [55550/160000]	lr: 3.917e-05, eta: 8:17:27, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1741, decode.acc_seg: 92.8318, aux.loss_ce: 0.0975, aux.acc_seg: 90.0230, loss: 0.2716, grad_norm: 2.1979
2023-02-19 08:28:40,117 - mmseg - INFO - Iter [55600/160000]	lr: 3.915e-05, eta: 8:17:16, time: 0.326, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1684, decode.acc_seg: 93.0729, aux.loss_ce: 0.0926, aux.acc_seg: 90.5100, loss: 0.2610, grad_norm: 2.4503
2023-02-19 08:28:54,127 - mmseg - INFO - Iter [55650/160000]	lr: 3.913e-05, eta: 8:17:01, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1692, decode.acc_seg: 93.1616, aux.loss_ce: 0.0978, aux.acc_seg: 90.1548, loss: 0.2670, grad_norm: 2.4131
2023-02-19 08:29:08,359 - mmseg - INFO - Iter [55700/160000]	lr: 3.911e-05, eta: 8:16:47, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1704, decode.acc_seg: 93.0200, aux.loss_ce: 0.0947, aux.acc_seg: 90.3818, loss: 0.2651, grad_norm: 2.3321
2023-02-19 08:29:21,925 - mmseg - INFO - Iter [55750/160000]	lr: 3.909e-05, eta: 8:16:31, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1719, decode.acc_seg: 92.9894, aux.loss_ce: 0.0937, aux.acc_seg: 90.5406, loss: 0.2656, grad_norm: 3.6038
2023-02-19 08:29:35,899 - mmseg - INFO - Iter [55800/160000]	lr: 3.908e-05, eta: 8:16:16, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1746, decode.acc_seg: 92.8806, aux.loss_ce: 0.0945, aux.acc_seg: 90.4616, loss: 0.2690, grad_norm: 1.9552
2023-02-19 08:29:49,661 - mmseg - INFO - Iter [55850/160000]	lr: 3.906e-05, eta: 8:16:01, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1727, decode.acc_seg: 93.0265, aux.loss_ce: 0.0966, aux.acc_seg: 90.2290, loss: 0.2693, grad_norm: 2.8159
2023-02-19 08:30:03,588 - mmseg - INFO - Iter [55900/160000]	lr: 3.904e-05, eta: 8:15:46, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1777, decode.acc_seg: 92.6189, aux.loss_ce: 0.0962, aux.acc_seg: 90.2610, loss: 0.2740, grad_norm: 2.2890
2023-02-19 08:30:17,188 - mmseg - INFO - Iter [55950/160000]	lr: 3.902e-05, eta: 8:15:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1793, decode.acc_seg: 92.4390, aux.loss_ce: 0.0944, aux.acc_seg: 90.1703, loss: 0.2736, grad_norm: 2.3456
2023-02-19 08:30:30,903 - mmseg - INFO - Saving checkpoint at 56000 iterations
2023-02-19 08:30:34,115 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:30:34,115 - mmseg - INFO - Iter [56000/160000]	lr: 3.900e-05, eta: 8:15:21, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1725, decode.acc_seg: 93.0066, aux.loss_ce: 0.0949, aux.acc_seg: 90.4237, loss: 0.2674, grad_norm: 2.3245
2023-02-19 08:30:47,962 - mmseg - INFO - Iter [56050/160000]	lr: 3.898e-05, eta: 8:15:06, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1733, decode.acc_seg: 92.9758, aux.loss_ce: 0.0958, aux.acc_seg: 90.1571, loss: 0.2691, grad_norm: 2.5048
2023-02-19 08:31:01,750 - mmseg - INFO - Iter [56100/160000]	lr: 3.896e-05, eta: 8:14:51, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1769, decode.acc_seg: 92.6698, aux.loss_ce: 0.0982, aux.acc_seg: 90.0223, loss: 0.2751, grad_norm: 2.2046
2023-02-19 08:31:15,793 - mmseg - INFO - Iter [56150/160000]	lr: 3.894e-05, eta: 8:14:36, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1778, decode.acc_seg: 92.6099, aux.loss_ce: 0.1004, aux.acc_seg: 89.6323, loss: 0.2782, grad_norm: 2.6565
2023-02-19 08:31:30,481 - mmseg - INFO - Iter [56200/160000]	lr: 3.893e-05, eta: 8:14:22, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1752, decode.acc_seg: 92.8749, aux.loss_ce: 0.0967, aux.acc_seg: 90.2370, loss: 0.2719, grad_norm: 2.4108
2023-02-19 08:31:44,175 - mmseg - INFO - Iter [56250/160000]	lr: 3.891e-05, eta: 8:14:07, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1810, decode.acc_seg: 92.4030, aux.loss_ce: 0.0973, aux.acc_seg: 89.9992, loss: 0.2783, grad_norm: 2.2939
2023-02-19 08:31:58,728 - mmseg - INFO - Iter [56300/160000]	lr: 3.889e-05, eta: 8:13:53, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1667, decode.acc_seg: 93.1564, aux.loss_ce: 0.0945, aux.acc_seg: 90.4669, loss: 0.2612, grad_norm: 2.2081
2023-02-19 08:32:13,121 - mmseg - INFO - Iter [56350/160000]	lr: 3.887e-05, eta: 8:13:39, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1831, decode.acc_seg: 92.5625, aux.loss_ce: 0.0999, aux.acc_seg: 89.8010, loss: 0.2830, grad_norm: 2.4024
2023-02-19 08:32:28,837 - mmseg - INFO - Iter [56400/160000]	lr: 3.885e-05, eta: 8:13:28, time: 0.314, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1707, decode.acc_seg: 92.9568, aux.loss_ce: 0.0931, aux.acc_seg: 90.4701, loss: 0.2638, grad_norm: 2.2117
2023-02-19 08:32:44,583 - mmseg - INFO - Iter [56450/160000]	lr: 3.883e-05, eta: 8:13:16, time: 0.315, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1827, decode.acc_seg: 92.7602, aux.loss_ce: 0.1006, aux.acc_seg: 90.0870, loss: 0.2833, grad_norm: 2.8962
2023-02-19 08:32:58,586 - mmseg - INFO - Iter [56500/160000]	lr: 3.881e-05, eta: 8:13:01, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1837, decode.acc_seg: 92.5282, aux.loss_ce: 0.1024, aux.acc_seg: 89.7091, loss: 0.2862, grad_norm: 2.4721
2023-02-19 08:33:12,184 - mmseg - INFO - Iter [56550/160000]	lr: 3.879e-05, eta: 8:12:46, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1761, decode.acc_seg: 93.0050, aux.loss_ce: 0.0945, aux.acc_seg: 90.5697, loss: 0.2706, grad_norm: 2.3426
2023-02-19 08:33:25,853 - mmseg - INFO - Iter [56600/160000]	lr: 3.878e-05, eta: 8:12:30, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1720, decode.acc_seg: 93.0417, aux.loss_ce: 0.0961, aux.acc_seg: 90.4106, loss: 0.2681, grad_norm: 2.3033
2023-02-19 08:33:39,783 - mmseg - INFO - Iter [56650/160000]	lr: 3.876e-05, eta: 8:12:15, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1683, decode.acc_seg: 92.8451, aux.loss_ce: 0.0923, aux.acc_seg: 90.1465, loss: 0.2606, grad_norm: 2.1004
2023-02-19 08:33:53,835 - mmseg - INFO - Iter [56700/160000]	lr: 3.874e-05, eta: 8:12:00, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1625, decode.acc_seg: 93.2955, aux.loss_ce: 0.0920, aux.acc_seg: 90.7022, loss: 0.2545, grad_norm: 2.3728
2023-02-19 08:34:08,276 - mmseg - INFO - Iter [56750/160000]	lr: 3.872e-05, eta: 8:11:46, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1770, decode.acc_seg: 92.8287, aux.loss_ce: 0.0975, aux.acc_seg: 90.0294, loss: 0.2745, grad_norm: 2.8983
2023-02-19 08:34:22,134 - mmseg - INFO - Iter [56800/160000]	lr: 3.870e-05, eta: 8:11:31, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1799, decode.acc_seg: 92.8135, aux.loss_ce: 0.0985, aux.acc_seg: 90.1385, loss: 0.2784, grad_norm: 3.0276
2023-02-19 08:34:37,919 - mmseg - INFO - Iter [56850/160000]	lr: 3.868e-05, eta: 8:11:20, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1746, decode.acc_seg: 92.8404, aux.loss_ce: 0.0982, aux.acc_seg: 90.0108, loss: 0.2728, grad_norm: 2.7715
2023-02-19 08:34:51,907 - mmseg - INFO - Iter [56900/160000]	lr: 3.866e-05, eta: 8:11:05, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1678, decode.acc_seg: 93.0528, aux.loss_ce: 0.0956, aux.acc_seg: 90.2091, loss: 0.2634, grad_norm: 2.6783
2023-02-19 08:35:05,728 - mmseg - INFO - Iter [56950/160000]	lr: 3.864e-05, eta: 8:10:50, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1671, decode.acc_seg: 93.1380, aux.loss_ce: 0.0933, aux.acc_seg: 90.4353, loss: 0.2603, grad_norm: 2.4185
2023-02-19 08:35:19,697 - mmseg - INFO - Saving checkpoint at 57000 iterations
2023-02-19 08:35:22,975 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:35:22,976 - mmseg - INFO - Iter [57000/160000]	lr: 3.863e-05, eta: 8:10:41, time: 0.345, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1668, decode.acc_seg: 93.2266, aux.loss_ce: 0.0949, aux.acc_seg: 90.3642, loss: 0.2617, grad_norm: 2.2816
2023-02-19 08:35:36,712 - mmseg - INFO - Iter [57050/160000]	lr: 3.861e-05, eta: 8:10:26, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1778, decode.acc_seg: 92.7155, aux.loss_ce: 0.0976, aux.acc_seg: 90.1786, loss: 0.2754, grad_norm: 2.7630
2023-02-19 08:35:50,842 - mmseg - INFO - Iter [57100/160000]	lr: 3.859e-05, eta: 8:10:11, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1740, decode.acc_seg: 93.0696, aux.loss_ce: 0.0992, aux.acc_seg: 90.0966, loss: 0.2732, grad_norm: 2.3686
2023-02-19 08:36:04,940 - mmseg - INFO - Iter [57150/160000]	lr: 3.857e-05, eta: 8:09:56, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1743, decode.acc_seg: 93.0748, aux.loss_ce: 0.0989, aux.acc_seg: 90.0425, loss: 0.2732, grad_norm: 2.7399
2023-02-19 08:36:18,569 - mmseg - INFO - Iter [57200/160000]	lr: 3.855e-05, eta: 8:09:41, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1741, decode.acc_seg: 92.9509, aux.loss_ce: 0.0984, aux.acc_seg: 90.1143, loss: 0.2726, grad_norm: 2.4230
2023-02-19 08:36:32,318 - mmseg - INFO - Iter [57250/160000]	lr: 3.853e-05, eta: 8:09:26, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1713, decode.acc_seg: 92.9034, aux.loss_ce: 0.0937, aux.acc_seg: 90.5016, loss: 0.2650, grad_norm: 2.2042
2023-02-19 08:36:46,271 - mmseg - INFO - Iter [57300/160000]	lr: 3.851e-05, eta: 8:09:11, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1750, decode.acc_seg: 92.8049, aux.loss_ce: 0.0959, aux.acc_seg: 90.2092, loss: 0.2708, grad_norm: 2.7025
2023-02-19 08:37:00,045 - mmseg - INFO - Iter [57350/160000]	lr: 3.849e-05, eta: 8:08:55, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1701, decode.acc_seg: 92.9390, aux.loss_ce: 0.0923, aux.acc_seg: 90.5124, loss: 0.2623, grad_norm: 2.1394
2023-02-19 08:37:14,126 - mmseg - INFO - Iter [57400/160000]	lr: 3.848e-05, eta: 8:08:41, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1735, decode.acc_seg: 92.9329, aux.loss_ce: 0.0981, aux.acc_seg: 90.3576, loss: 0.2716, grad_norm: 2.3856
2023-02-19 08:37:27,948 - mmseg - INFO - Iter [57450/160000]	lr: 3.846e-05, eta: 8:08:26, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1636, decode.acc_seg: 93.2416, aux.loss_ce: 0.0900, aux.acc_seg: 90.8356, loss: 0.2536, grad_norm: 2.3561
2023-02-19 08:37:41,685 - mmseg - INFO - Iter [57500/160000]	lr: 3.844e-05, eta: 8:08:10, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1793, decode.acc_seg: 92.9782, aux.loss_ce: 0.0995, aux.acc_seg: 90.1281, loss: 0.2788, grad_norm: 2.4336
2023-02-19 08:37:55,353 - mmseg - INFO - Iter [57550/160000]	lr: 3.842e-05, eta: 8:07:55, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1667, decode.acc_seg: 93.1520, aux.loss_ce: 0.0952, aux.acc_seg: 90.1718, loss: 0.2619, grad_norm: 2.9757
2023-02-19 08:38:08,932 - mmseg - INFO - Iter [57600/160000]	lr: 3.840e-05, eta: 8:07:39, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1743, decode.acc_seg: 92.7913, aux.loss_ce: 0.0967, aux.acc_seg: 90.1367, loss: 0.2710, grad_norm: 3.1264
2023-02-19 08:38:23,171 - mmseg - INFO - Iter [57650/160000]	lr: 3.838e-05, eta: 8:07:25, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1846, decode.acc_seg: 92.6513, aux.loss_ce: 0.0986, aux.acc_seg: 90.2377, loss: 0.2832, grad_norm: 3.4540
2023-02-19 08:38:36,895 - mmseg - INFO - Iter [57700/160000]	lr: 3.836e-05, eta: 8:07:10, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1679, decode.acc_seg: 93.1633, aux.loss_ce: 0.0939, aux.acc_seg: 90.5933, loss: 0.2618, grad_norm: 2.1344
2023-02-19 08:38:50,994 - mmseg - INFO - Iter [57750/160000]	lr: 3.834e-05, eta: 8:06:55, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1720, decode.acc_seg: 92.9163, aux.loss_ce: 0.0942, aux.acc_seg: 90.4002, loss: 0.2662, grad_norm: 2.1020
2023-02-19 08:39:05,196 - mmseg - INFO - Iter [57800/160000]	lr: 3.833e-05, eta: 8:06:41, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1697, decode.acc_seg: 93.1206, aux.loss_ce: 0.0935, aux.acc_seg: 90.6068, loss: 0.2632, grad_norm: 2.3526
2023-02-19 08:39:19,175 - mmseg - INFO - Iter [57850/160000]	lr: 3.831e-05, eta: 8:06:26, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1757, decode.acc_seg: 93.0646, aux.loss_ce: 0.0957, aux.acc_seg: 90.4843, loss: 0.2714, grad_norm: 2.6329
2023-02-19 08:39:33,580 - mmseg - INFO - Iter [57900/160000]	lr: 3.829e-05, eta: 8:06:12, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1736, decode.acc_seg: 92.8517, aux.loss_ce: 0.0978, aux.acc_seg: 90.1542, loss: 0.2714, grad_norm: 2.5041
2023-02-19 08:39:47,109 - mmseg - INFO - Iter [57950/160000]	lr: 3.827e-05, eta: 8:05:56, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1762, decode.acc_seg: 92.8291, aux.loss_ce: 0.0958, aux.acc_seg: 90.3445, loss: 0.2719, grad_norm: 2.4184
2023-02-19 08:40:01,376 - mmseg - INFO - Saving checkpoint at 58000 iterations
2023-02-19 08:40:04,628 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:40:04,628 - mmseg - INFO - Iter [58000/160000]	lr: 3.825e-05, eta: 8:05:48, time: 0.351, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1769, decode.acc_seg: 92.7708, aux.loss_ce: 0.0963, aux.acc_seg: 90.2914, loss: 0.2732, grad_norm: 4.1633
2023-02-19 08:40:18,602 - mmseg - INFO - Iter [58050/160000]	lr: 3.823e-05, eta: 8:05:33, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1741, decode.acc_seg: 92.9958, aux.loss_ce: 0.0992, aux.acc_seg: 89.9770, loss: 0.2733, grad_norm: 2.4940
2023-02-19 08:40:34,421 - mmseg - INFO - Iter [58100/160000]	lr: 3.821e-05, eta: 8:05:21, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1805, decode.acc_seg: 92.7844, aux.loss_ce: 0.0996, aux.acc_seg: 90.1255, loss: 0.2801, grad_norm: 2.8033
2023-02-19 08:40:48,579 - mmseg - INFO - Iter [58150/160000]	lr: 3.819e-05, eta: 8:05:07, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1707, decode.acc_seg: 92.9716, aux.loss_ce: 0.0911, aux.acc_seg: 90.6520, loss: 0.2618, grad_norm: 2.4108
2023-02-19 08:41:02,264 - mmseg - INFO - Iter [58200/160000]	lr: 3.818e-05, eta: 8:04:51, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1796, decode.acc_seg: 92.6685, aux.loss_ce: 0.0996, aux.acc_seg: 89.8457, loss: 0.2792, grad_norm: 2.5015
2023-02-19 08:41:16,260 - mmseg - INFO - Iter [58250/160000]	lr: 3.816e-05, eta: 8:04:37, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1650, decode.acc_seg: 93.1015, aux.loss_ce: 0.0922, aux.acc_seg: 90.4927, loss: 0.2573, grad_norm: 2.2667
2023-02-19 08:41:29,891 - mmseg - INFO - Iter [58300/160000]	lr: 3.814e-05, eta: 8:04:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1724, decode.acc_seg: 92.8872, aux.loss_ce: 0.0960, aux.acc_seg: 90.1989, loss: 0.2684, grad_norm: 2.8640
2023-02-19 08:41:43,848 - mmseg - INFO - Iter [58350/160000]	lr: 3.812e-05, eta: 8:04:06, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1768, decode.acc_seg: 92.6008, aux.loss_ce: 0.0967, aux.acc_seg: 90.1071, loss: 0.2735, grad_norm: 3.0830
2023-02-19 08:41:57,506 - mmseg - INFO - Iter [58400/160000]	lr: 3.810e-05, eta: 8:03:51, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1711, decode.acc_seg: 92.8993, aux.loss_ce: 0.0931, aux.acc_seg: 90.3784, loss: 0.2642, grad_norm: 2.2056
2023-02-19 08:42:11,488 - mmseg - INFO - Iter [58450/160000]	lr: 3.808e-05, eta: 8:03:36, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1645, decode.acc_seg: 93.2643, aux.loss_ce: 0.0896, aux.acc_seg: 90.7676, loss: 0.2541, grad_norm: 1.8904
2023-02-19 08:42:25,836 - mmseg - INFO - Iter [58500/160000]	lr: 3.806e-05, eta: 8:03:22, time: 0.288, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1722, decode.acc_seg: 92.8299, aux.loss_ce: 0.0966, aux.acc_seg: 89.9800, loss: 0.2687, grad_norm: 2.4921
2023-02-19 08:42:39,970 - mmseg - INFO - Iter [58550/160000]	lr: 3.804e-05, eta: 8:03:07, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1767, decode.acc_seg: 92.7436, aux.loss_ce: 0.0965, aux.acc_seg: 90.1225, loss: 0.2732, grad_norm: 2.3429
2023-02-19 08:42:54,080 - mmseg - INFO - Iter [58600/160000]	lr: 3.803e-05, eta: 8:02:53, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1617, decode.acc_seg: 93.2875, aux.loss_ce: 0.0924, aux.acc_seg: 90.5665, loss: 0.2541, grad_norm: 2.1968
2023-02-19 08:43:08,329 - mmseg - INFO - Iter [58650/160000]	lr: 3.801e-05, eta: 8:02:38, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1710, decode.acc_seg: 93.1540, aux.loss_ce: 0.0977, aux.acc_seg: 90.3388, loss: 0.2688, grad_norm: 2.5996
2023-02-19 08:43:22,501 - mmseg - INFO - Iter [58700/160000]	lr: 3.799e-05, eta: 8:02:24, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1692, decode.acc_seg: 92.9221, aux.loss_ce: 0.0953, aux.acc_seg: 90.1613, loss: 0.2645, grad_norm: 3.3194
2023-02-19 08:43:36,243 - mmseg - INFO - Iter [58750/160000]	lr: 3.797e-05, eta: 8:02:09, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1721, decode.acc_seg: 92.8449, aux.loss_ce: 0.0944, aux.acc_seg: 90.2297, loss: 0.2665, grad_norm: 2.4393
2023-02-19 08:43:50,882 - mmseg - INFO - Iter [58800/160000]	lr: 3.795e-05, eta: 8:01:55, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1757, decode.acc_seg: 92.7256, aux.loss_ce: 0.0996, aux.acc_seg: 89.8228, loss: 0.2753, grad_norm: 2.5842
2023-02-19 08:44:04,850 - mmseg - INFO - Iter [58850/160000]	lr: 3.793e-05, eta: 8:01:40, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1653, decode.acc_seg: 93.2526, aux.loss_ce: 0.0931, aux.acc_seg: 90.7268, loss: 0.2584, grad_norm: 2.1550
2023-02-19 08:44:18,630 - mmseg - INFO - Iter [58900/160000]	lr: 3.791e-05, eta: 8:01:25, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1676, decode.acc_seg: 93.3115, aux.loss_ce: 0.0920, aux.acc_seg: 90.7822, loss: 0.2596, grad_norm: 2.5999
2023-02-19 08:44:33,163 - mmseg - INFO - Iter [58950/160000]	lr: 3.789e-05, eta: 8:01:11, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1560, decode.acc_seg: 93.4457, aux.loss_ce: 0.0899, aux.acc_seg: 90.7359, loss: 0.2459, grad_norm: 2.1028
2023-02-19 08:44:46,733 - mmseg - INFO - Saving checkpoint at 59000 iterations
2023-02-19 08:44:49,970 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:44:49,970 - mmseg - INFO - Iter [59000/160000]	lr: 3.788e-05, eta: 8:01:01, time: 0.336, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1625, decode.acc_seg: 93.1921, aux.loss_ce: 0.0904, aux.acc_seg: 90.6032, loss: 0.2529, grad_norm: 2.2320
2023-02-19 08:45:03,546 - mmseg - INFO - Iter [59050/160000]	lr: 3.786e-05, eta: 8:00:46, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1844, decode.acc_seg: 92.6889, aux.loss_ce: 0.0990, aux.acc_seg: 90.1654, loss: 0.2833, grad_norm: 2.8159
2023-02-19 08:45:18,154 - mmseg - INFO - Iter [59100/160000]	lr: 3.784e-05, eta: 8:00:32, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1722, decode.acc_seg: 92.8775, aux.loss_ce: 0.0965, aux.acc_seg: 90.0674, loss: 0.2687, grad_norm: 2.4161
2023-02-19 08:45:31,736 - mmseg - INFO - Iter [59150/160000]	lr: 3.782e-05, eta: 8:00:16, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1712, decode.acc_seg: 92.9185, aux.loss_ce: 0.0934, aux.acc_seg: 90.4762, loss: 0.2646, grad_norm: 2.5331
2023-02-19 08:45:45,957 - mmseg - INFO - Iter [59200/160000]	lr: 3.780e-05, eta: 8:00:02, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1630, decode.acc_seg: 93.3751, aux.loss_ce: 0.0895, aux.acc_seg: 90.8193, loss: 0.2525, grad_norm: 2.0119
2023-02-19 08:46:00,885 - mmseg - INFO - Iter [59250/160000]	lr: 3.778e-05, eta: 7:59:49, time: 0.299, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1634, decode.acc_seg: 93.2577, aux.loss_ce: 0.0907, aux.acc_seg: 90.6838, loss: 0.2541, grad_norm: 1.8638
2023-02-19 08:46:14,482 - mmseg - INFO - Iter [59300/160000]	lr: 3.776e-05, eta: 7:59:33, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1739, decode.acc_seg: 92.7977, aux.loss_ce: 0.0983, aux.acc_seg: 89.9458, loss: 0.2721, grad_norm: 2.6054
2023-02-19 08:46:28,217 - mmseg - INFO - Iter [59350/160000]	lr: 3.774e-05, eta: 7:59:18, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1619, decode.acc_seg: 93.3345, aux.loss_ce: 0.0933, aux.acc_seg: 90.4757, loss: 0.2552, grad_norm: 2.4098
2023-02-19 08:46:44,250 - mmseg - INFO - Iter [59400/160000]	lr: 3.773e-05, eta: 7:59:07, time: 0.321, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1652, decode.acc_seg: 93.2265, aux.loss_ce: 0.0921, aux.acc_seg: 90.8187, loss: 0.2573, grad_norm: 2.7462
2023-02-19 08:46:57,986 - mmseg - INFO - Iter [59450/160000]	lr: 3.771e-05, eta: 7:58:52, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1651, decode.acc_seg: 93.1848, aux.loss_ce: 0.0908, aux.acc_seg: 90.5490, loss: 0.2559, grad_norm: 2.5048
2023-02-19 08:47:11,599 - mmseg - INFO - Iter [59500/160000]	lr: 3.769e-05, eta: 7:58:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1652, decode.acc_seg: 93.2302, aux.loss_ce: 0.0931, aux.acc_seg: 90.4558, loss: 0.2583, grad_norm: 3.1310
2023-02-19 08:47:26,572 - mmseg - INFO - Iter [59550/160000]	lr: 3.767e-05, eta: 7:58:23, time: 0.299, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1639, decode.acc_seg: 93.1710, aux.loss_ce: 0.0905, aux.acc_seg: 90.5921, loss: 0.2544, grad_norm: 2.7971
2023-02-19 08:47:41,695 - mmseg - INFO - Iter [59600/160000]	lr: 3.765e-05, eta: 7:58:10, time: 0.303, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1859, decode.acc_seg: 92.6220, aux.loss_ce: 0.1009, aux.acc_seg: 90.2182, loss: 0.2868, grad_norm: 2.9398
2023-02-19 08:47:55,610 - mmseg - INFO - Iter [59650/160000]	lr: 3.763e-05, eta: 7:57:55, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1615, decode.acc_seg: 93.4042, aux.loss_ce: 0.0914, aux.acc_seg: 90.7201, loss: 0.2529, grad_norm: 2.6193
2023-02-19 08:48:09,336 - mmseg - INFO - Iter [59700/160000]	lr: 3.761e-05, eta: 7:57:40, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1648, decode.acc_seg: 93.2785, aux.loss_ce: 0.0927, aux.acc_seg: 90.4896, loss: 0.2575, grad_norm: 2.3005
2023-02-19 08:48:22,870 - mmseg - INFO - Iter [59750/160000]	lr: 3.759e-05, eta: 7:57:24, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1605, decode.acc_seg: 93.3531, aux.loss_ce: 0.0920, aux.acc_seg: 90.5833, loss: 0.2525, grad_norm: 2.4457
2023-02-19 08:48:36,959 - mmseg - INFO - Iter [59800/160000]	lr: 3.758e-05, eta: 7:57:10, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1689, decode.acc_seg: 92.8766, aux.loss_ce: 0.0936, aux.acc_seg: 90.2065, loss: 0.2625, grad_norm: 2.6219
2023-02-19 08:48:51,221 - mmseg - INFO - Iter [59850/160000]	lr: 3.756e-05, eta: 7:56:55, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1644, decode.acc_seg: 93.2185, aux.loss_ce: 0.0917, aux.acc_seg: 90.5926, loss: 0.2561, grad_norm: 2.6112
2023-02-19 08:49:05,055 - mmseg - INFO - Iter [59900/160000]	lr: 3.754e-05, eta: 7:56:40, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1840, decode.acc_seg: 92.7876, aux.loss_ce: 0.1008, aux.acc_seg: 90.1387, loss: 0.2848, grad_norm: 3.0652
2023-02-19 08:49:18,832 - mmseg - INFO - Iter [59950/160000]	lr: 3.752e-05, eta: 7:56:25, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1711, decode.acc_seg: 93.0211, aux.loss_ce: 0.0957, aux.acc_seg: 90.4692, loss: 0.2669, grad_norm: 2.1192
2023-02-19 08:49:33,670 - mmseg - INFO - Saving checkpoint at 60000 iterations
2023-02-19 08:49:36,959 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:49:36,959 - mmseg - INFO - Iter [60000/160000]	lr: 3.750e-05, eta: 7:56:17, time: 0.363, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1544, decode.acc_seg: 93.5999, aux.loss_ce: 0.0875, aux.acc_seg: 91.1014, loss: 0.2419, grad_norm: 1.9535
2023-02-19 08:49:50,656 - mmseg - INFO - Iter [60050/160000]	lr: 3.748e-05, eta: 7:56:02, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1582, decode.acc_seg: 93.3455, aux.loss_ce: 0.0897, aux.acc_seg: 90.6598, loss: 0.2479, grad_norm: 2.8639
2023-02-19 08:50:04,783 - mmseg - INFO - Iter [60100/160000]	lr: 3.746e-05, eta: 7:55:48, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1616, decode.acc_seg: 93.2790, aux.loss_ce: 0.0885, aux.acc_seg: 90.9780, loss: 0.2501, grad_norm: 2.3777
2023-02-19 08:50:18,514 - mmseg - INFO - Iter [60150/160000]	lr: 3.744e-05, eta: 7:55:32, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1738, decode.acc_seg: 92.8024, aux.loss_ce: 0.0968, aux.acc_seg: 90.1767, loss: 0.2705, grad_norm: 2.5834
2023-02-19 08:50:32,326 - mmseg - INFO - Iter [60200/160000]	lr: 3.743e-05, eta: 7:55:17, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1614, decode.acc_seg: 93.1841, aux.loss_ce: 0.0901, aux.acc_seg: 90.5246, loss: 0.2515, grad_norm: 2.2053
2023-02-19 08:50:47,262 - mmseg - INFO - Iter [60250/160000]	lr: 3.741e-05, eta: 7:55:04, time: 0.299, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1650, decode.acc_seg: 93.3868, aux.loss_ce: 0.0916, aux.acc_seg: 90.6610, loss: 0.2566, grad_norm: 2.5552
2023-02-19 08:51:01,843 - mmseg - INFO - Iter [60300/160000]	lr: 3.739e-05, eta: 7:54:50, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1676, decode.acc_seg: 93.0748, aux.loss_ce: 0.0916, aux.acc_seg: 90.6942, loss: 0.2591, grad_norm: 2.6790
2023-02-19 08:51:15,490 - mmseg - INFO - Iter [60350/160000]	lr: 3.737e-05, eta: 7:54:35, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1682, decode.acc_seg: 92.8983, aux.loss_ce: 0.0914, aux.acc_seg: 90.4110, loss: 0.2596, grad_norm: 2.2172
2023-02-19 08:51:30,291 - mmseg - INFO - Iter [60400/160000]	lr: 3.735e-05, eta: 7:54:21, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1691, decode.acc_seg: 93.0647, aux.loss_ce: 0.0948, aux.acc_seg: 90.3114, loss: 0.2640, grad_norm: 2.3219
2023-02-19 08:51:44,048 - mmseg - INFO - Iter [60450/160000]	lr: 3.733e-05, eta: 7:54:06, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1764, decode.acc_seg: 92.8334, aux.loss_ce: 0.1015, aux.acc_seg: 89.9053, loss: 0.2779, grad_norm: 2.6778
2023-02-19 08:51:57,706 - mmseg - INFO - Iter [60500/160000]	lr: 3.731e-05, eta: 7:53:51, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1640, decode.acc_seg: 93.2451, aux.loss_ce: 0.0907, aux.acc_seg: 90.7723, loss: 0.2547, grad_norm: 2.5041
2023-02-19 08:52:11,356 - mmseg - INFO - Iter [60550/160000]	lr: 3.729e-05, eta: 7:53:36, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1814, decode.acc_seg: 92.5970, aux.loss_ce: 0.1011, aux.acc_seg: 89.6576, loss: 0.2825, grad_norm: 2.5345
2023-02-19 08:52:25,788 - mmseg - INFO - Iter [60600/160000]	lr: 3.728e-05, eta: 7:53:22, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1774, decode.acc_seg: 92.8185, aux.loss_ce: 0.1002, aux.acc_seg: 90.0375, loss: 0.2776, grad_norm: 2.8103
2023-02-19 08:52:41,789 - mmseg - INFO - Iter [60650/160000]	lr: 3.726e-05, eta: 7:53:10, time: 0.320, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1614, decode.acc_seg: 93.3291, aux.loss_ce: 0.0909, aux.acc_seg: 90.5746, loss: 0.2522, grad_norm: 2.2215
2023-02-19 08:52:56,150 - mmseg - INFO - Iter [60700/160000]	lr: 3.724e-05, eta: 7:52:56, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1592, decode.acc_seg: 93.4659, aux.loss_ce: 0.0909, aux.acc_seg: 90.6840, loss: 0.2501, grad_norm: 2.7030
2023-02-19 08:53:10,163 - mmseg - INFO - Iter [60750/160000]	lr: 3.722e-05, eta: 7:52:41, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1560, decode.acc_seg: 93.5455, aux.loss_ce: 0.0874, aux.acc_seg: 90.9759, loss: 0.2435, grad_norm: 2.4777
2023-02-19 08:53:24,005 - mmseg - INFO - Iter [60800/160000]	lr: 3.720e-05, eta: 7:52:26, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1717, decode.acc_seg: 93.1033, aux.loss_ce: 0.0921, aux.acc_seg: 90.7037, loss: 0.2638, grad_norm: 2.9939
2023-02-19 08:53:37,981 - mmseg - INFO - Iter [60850/160000]	lr: 3.718e-05, eta: 7:52:11, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1633, decode.acc_seg: 93.2034, aux.loss_ce: 0.0931, aux.acc_seg: 90.3411, loss: 0.2564, grad_norm: 2.2447
2023-02-19 08:53:51,724 - mmseg - INFO - Iter [60900/160000]	lr: 3.716e-05, eta: 7:51:56, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1592, decode.acc_seg: 93.1949, aux.loss_ce: 0.0898, aux.acc_seg: 90.5808, loss: 0.2491, grad_norm: 2.3109
2023-02-19 08:54:05,635 - mmseg - INFO - Iter [60950/160000]	lr: 3.714e-05, eta: 7:51:41, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1625, decode.acc_seg: 93.2443, aux.loss_ce: 0.0900, aux.acc_seg: 90.9948, loss: 0.2525, grad_norm: 2.4039
2023-02-19 08:54:19,319 - mmseg - INFO - Saving checkpoint at 61000 iterations
2023-02-19 08:54:22,603 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:54:22,603 - mmseg - INFO - Iter [61000/160000]	lr: 3.713e-05, eta: 7:51:31, time: 0.339, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1605, decode.acc_seg: 93.2629, aux.loss_ce: 0.0894, aux.acc_seg: 90.7742, loss: 0.2498, grad_norm: 2.3242
2023-02-19 08:54:37,164 - mmseg - INFO - Iter [61050/160000]	lr: 3.711e-05, eta: 7:51:18, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1621, decode.acc_seg: 93.4634, aux.loss_ce: 0.0939, aux.acc_seg: 90.5633, loss: 0.2560, grad_norm: 2.0533
2023-02-19 08:54:51,040 - mmseg - INFO - Iter [61100/160000]	lr: 3.709e-05, eta: 7:51:03, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1702, decode.acc_seg: 92.9453, aux.loss_ce: 0.0956, aux.acc_seg: 90.2260, loss: 0.2658, grad_norm: 2.8179
2023-02-19 08:55:05,045 - mmseg - INFO - Iter [61150/160000]	lr: 3.707e-05, eta: 7:50:48, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1611, decode.acc_seg: 93.3516, aux.loss_ce: 0.0887, aux.acc_seg: 90.8643, loss: 0.2498, grad_norm: 2.4874
2023-02-19 08:55:19,349 - mmseg - INFO - Iter [61200/160000]	lr: 3.705e-05, eta: 7:50:34, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1664, decode.acc_seg: 93.1045, aux.loss_ce: 0.0942, aux.acc_seg: 90.2976, loss: 0.2607, grad_norm: 2.7162
2023-02-19 08:55:33,809 - mmseg - INFO - Iter [61250/160000]	lr: 3.703e-05, eta: 7:50:20, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1696, decode.acc_seg: 92.8526, aux.loss_ce: 0.0927, aux.acc_seg: 90.2749, loss: 0.2624, grad_norm: 2.5225
2023-02-19 08:55:48,107 - mmseg - INFO - Iter [61300/160000]	lr: 3.701e-05, eta: 7:50:05, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1579, decode.acc_seg: 93.4377, aux.loss_ce: 0.0897, aux.acc_seg: 90.7860, loss: 0.2476, grad_norm: 2.7012
2023-02-19 08:56:01,642 - mmseg - INFO - Iter [61350/160000]	lr: 3.699e-05, eta: 7:49:50, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1665, decode.acc_seg: 93.2714, aux.loss_ce: 0.0923, aux.acc_seg: 90.7371, loss: 0.2588, grad_norm: 2.0103
2023-02-19 08:56:15,224 - mmseg - INFO - Iter [61400/160000]	lr: 3.698e-05, eta: 7:49:34, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1608, decode.acc_seg: 93.3077, aux.loss_ce: 0.0868, aux.acc_seg: 90.9858, loss: 0.2476, grad_norm: 2.0694
2023-02-19 08:56:29,251 - mmseg - INFO - Iter [61450/160000]	lr: 3.696e-05, eta: 7:49:20, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1779, decode.acc_seg: 92.5144, aux.loss_ce: 0.0969, aux.acc_seg: 89.9078, loss: 0.2748, grad_norm: 2.9180
2023-02-19 08:56:43,132 - mmseg - INFO - Iter [61500/160000]	lr: 3.694e-05, eta: 7:49:05, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1743, decode.acc_seg: 92.7559, aux.loss_ce: 0.0962, aux.acc_seg: 90.0971, loss: 0.2705, grad_norm: 3.2975
2023-02-19 08:56:56,779 - mmseg - INFO - Iter [61550/160000]	lr: 3.692e-05, eta: 7:48:49, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1548, decode.acc_seg: 93.4228, aux.loss_ce: 0.0908, aux.acc_seg: 90.5806, loss: 0.2455, grad_norm: 2.2977
2023-02-19 08:57:10,551 - mmseg - INFO - Iter [61600/160000]	lr: 3.690e-05, eta: 7:48:34, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1630, decode.acc_seg: 93.2588, aux.loss_ce: 0.0909, aux.acc_seg: 90.8161, loss: 0.2538, grad_norm: 2.3546
2023-02-19 08:57:24,100 - mmseg - INFO - Iter [61650/160000]	lr: 3.688e-05, eta: 7:48:19, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1603, decode.acc_seg: 93.3156, aux.loss_ce: 0.0891, aux.acc_seg: 90.8111, loss: 0.2494, grad_norm: 2.1547
2023-02-19 08:57:37,881 - mmseg - INFO - Iter [61700/160000]	lr: 3.686e-05, eta: 7:48:04, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1564, decode.acc_seg: 93.3847, aux.loss_ce: 0.0887, aux.acc_seg: 90.7720, loss: 0.2451, grad_norm: 2.6920
2023-02-19 08:57:51,715 - mmseg - INFO - Iter [61750/160000]	lr: 3.684e-05, eta: 7:47:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1736, decode.acc_seg: 92.6642, aux.loss_ce: 0.0970, aux.acc_seg: 90.0990, loss: 0.2706, grad_norm: 2.2178
2023-02-19 08:58:05,407 - mmseg - INFO - Iter [61800/160000]	lr: 3.683e-05, eta: 7:47:34, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1672, decode.acc_seg: 92.9024, aux.loss_ce: 0.0926, aux.acc_seg: 90.4099, loss: 0.2598, grad_norm: 3.1475
2023-02-19 08:58:19,704 - mmseg - INFO - Iter [61850/160000]	lr: 3.681e-05, eta: 7:47:19, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1620, decode.acc_seg: 93.3474, aux.loss_ce: 0.0906, aux.acc_seg: 90.8057, loss: 0.2526, grad_norm: 2.7504
2023-02-19 08:58:35,953 - mmseg - INFO - Iter [61900/160000]	lr: 3.679e-05, eta: 7:47:08, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1611, decode.acc_seg: 93.3113, aux.loss_ce: 0.0921, aux.acc_seg: 90.4267, loss: 0.2532, grad_norm: 2.9015
2023-02-19 08:58:50,131 - mmseg - INFO - Iter [61950/160000]	lr: 3.677e-05, eta: 7:46:54, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1632, decode.acc_seg: 93.2901, aux.loss_ce: 0.0909, aux.acc_seg: 90.6563, loss: 0.2541, grad_norm: 3.3846
2023-02-19 08:59:04,695 - mmseg - INFO - Saving checkpoint at 62000 iterations
2023-02-19 08:59:07,945 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 08:59:07,945 - mmseg - INFO - Iter [62000/160000]	lr: 3.675e-05, eta: 7:46:45, time: 0.356, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1654, decode.acc_seg: 93.1340, aux.loss_ce: 0.0878, aux.acc_seg: 90.8859, loss: 0.2533, grad_norm: 2.7383
2023-02-19 08:59:21,952 - mmseg - INFO - Iter [62050/160000]	lr: 3.673e-05, eta: 7:46:30, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1543, decode.acc_seg: 93.6188, aux.loss_ce: 0.0858, aux.acc_seg: 91.1685, loss: 0.2401, grad_norm: 1.7855
2023-02-19 08:59:35,575 - mmseg - INFO - Iter [62100/160000]	lr: 3.671e-05, eta: 7:46:15, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1605, decode.acc_seg: 93.3779, aux.loss_ce: 0.0909, aux.acc_seg: 90.6529, loss: 0.2514, grad_norm: 2.1290
2023-02-19 08:59:49,965 - mmseg - INFO - Iter [62150/160000]	lr: 3.669e-05, eta: 7:46:01, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1643, decode.acc_seg: 93.2619, aux.loss_ce: 0.0928, aux.acc_seg: 90.4479, loss: 0.2572, grad_norm: 2.7684
2023-02-19 09:00:03,562 - mmseg - INFO - Iter [62200/160000]	lr: 3.668e-05, eta: 7:45:45, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1622, decode.acc_seg: 93.3926, aux.loss_ce: 0.0902, aux.acc_seg: 90.8654, loss: 0.2524, grad_norm: 2.5811
2023-02-19 09:00:17,788 - mmseg - INFO - Iter [62250/160000]	lr: 3.666e-05, eta: 7:45:31, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1659, decode.acc_seg: 93.1857, aux.loss_ce: 0.0905, aux.acc_seg: 90.7747, loss: 0.2564, grad_norm: 2.0981
2023-02-19 09:00:31,435 - mmseg - INFO - Iter [62300/160000]	lr: 3.664e-05, eta: 7:45:16, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1525, decode.acc_seg: 93.6588, aux.loss_ce: 0.0864, aux.acc_seg: 91.0293, loss: 0.2389, grad_norm: 2.2075
2023-02-19 09:00:45,288 - mmseg - INFO - Iter [62350/160000]	lr: 3.662e-05, eta: 7:45:01, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1656, decode.acc_seg: 93.3368, aux.loss_ce: 0.0932, aux.acc_seg: 90.5531, loss: 0.2588, grad_norm: 2.7612
2023-02-19 09:00:59,203 - mmseg - INFO - Iter [62400/160000]	lr: 3.660e-05, eta: 7:44:46, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1566, decode.acc_seg: 93.7276, aux.loss_ce: 0.0866, aux.acc_seg: 91.3637, loss: 0.2431, grad_norm: 2.3722
2023-02-19 09:01:12,811 - mmseg - INFO - Iter [62450/160000]	lr: 3.658e-05, eta: 7:44:31, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1604, decode.acc_seg: 93.4132, aux.loss_ce: 0.0900, aux.acc_seg: 90.7946, loss: 0.2504, grad_norm: 1.9919
2023-02-19 09:01:27,068 - mmseg - INFO - Iter [62500/160000]	lr: 3.656e-05, eta: 7:44:16, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1656, decode.acc_seg: 93.2377, aux.loss_ce: 0.0930, aux.acc_seg: 90.5055, loss: 0.2586, grad_norm: 2.4464
2023-02-19 09:01:41,415 - mmseg - INFO - Iter [62550/160000]	lr: 3.654e-05, eta: 7:44:02, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1535, decode.acc_seg: 93.7061, aux.loss_ce: 0.0884, aux.acc_seg: 90.9510, loss: 0.2419, grad_norm: 2.0225
2023-02-19 09:01:55,042 - mmseg - INFO - Iter [62600/160000]	lr: 3.653e-05, eta: 7:43:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1694, decode.acc_seg: 93.0644, aux.loss_ce: 0.0946, aux.acc_seg: 90.3562, loss: 0.2641, grad_norm: 3.2760
2023-02-19 09:02:08,715 - mmseg - INFO - Iter [62650/160000]	lr: 3.651e-05, eta: 7:43:31, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1653, decode.acc_seg: 93.0675, aux.loss_ce: 0.0916, aux.acc_seg: 90.5524, loss: 0.2569, grad_norm: 2.7039
2023-02-19 09:02:22,354 - mmseg - INFO - Iter [62700/160000]	lr: 3.649e-05, eta: 7:43:16, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1733, decode.acc_seg: 93.0302, aux.loss_ce: 0.0994, aux.acc_seg: 90.0793, loss: 0.2727, grad_norm: 4.2614
2023-02-19 09:02:36,568 - mmseg - INFO - Iter [62750/160000]	lr: 3.647e-05, eta: 7:43:02, time: 0.285, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1685, decode.acc_seg: 93.1335, aux.loss_ce: 0.0917, aux.acc_seg: 90.7012, loss: 0.2602, grad_norm: 2.2575
2023-02-19 09:02:50,297 - mmseg - INFO - Iter [62800/160000]	lr: 3.645e-05, eta: 7:42:47, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1717, decode.acc_seg: 93.0087, aux.loss_ce: 0.0953, aux.acc_seg: 90.4777, loss: 0.2670, grad_norm: 2.3769
2023-02-19 09:03:04,326 - mmseg - INFO - Iter [62850/160000]	lr: 3.643e-05, eta: 7:42:32, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1621, decode.acc_seg: 93.3532, aux.loss_ce: 0.0924, aux.acc_seg: 90.6276, loss: 0.2546, grad_norm: 2.1621
2023-02-19 09:03:17,963 - mmseg - INFO - Iter [62900/160000]	lr: 3.641e-05, eta: 7:42:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1743, decode.acc_seg: 92.8424, aux.loss_ce: 0.0981, aux.acc_seg: 90.0034, loss: 0.2724, grad_norm: 2.7943
2023-02-19 09:03:32,697 - mmseg - INFO - Iter [62950/160000]	lr: 3.639e-05, eta: 7:42:03, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1652, decode.acc_seg: 93.2498, aux.loss_ce: 0.0895, aux.acc_seg: 91.0038, loss: 0.2548, grad_norm: 2.9727
2023-02-19 09:03:46,679 - mmseg - INFO - Saving checkpoint at 63000 iterations
2023-02-19 09:03:50,000 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:03:50,001 - mmseg - INFO - Iter [63000/160000]	lr: 3.638e-05, eta: 7:41:53, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1553, decode.acc_seg: 93.5019, aux.loss_ce: 0.0912, aux.acc_seg: 90.5959, loss: 0.2465, grad_norm: 3.4018
2023-02-19 09:04:04,248 - mmseg - INFO - Iter [63050/160000]	lr: 3.636e-05, eta: 7:41:39, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1601, decode.acc_seg: 93.5179, aux.loss_ce: 0.0904, aux.acc_seg: 90.8455, loss: 0.2505, grad_norm: 2.4051
2023-02-19 09:04:18,137 - mmseg - INFO - Iter [63100/160000]	lr: 3.634e-05, eta: 7:41:24, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1644, decode.acc_seg: 93.4450, aux.loss_ce: 0.0917, aux.acc_seg: 90.9097, loss: 0.2561, grad_norm: 2.9405
2023-02-19 09:04:31,877 - mmseg - INFO - Iter [63150/160000]	lr: 3.632e-05, eta: 7:41:09, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1656, decode.acc_seg: 93.3017, aux.loss_ce: 0.0925, aux.acc_seg: 90.6946, loss: 0.2581, grad_norm: 2.8843
2023-02-19 09:04:48,365 - mmseg - INFO - Iter [63200/160000]	lr: 3.630e-05, eta: 7:40:58, time: 0.330, data_time: 0.049, memory: 15214, decode.loss_ce: 0.1621, decode.acc_seg: 93.4382, aux.loss_ce: 0.0900, aux.acc_seg: 90.8792, loss: 0.2521, grad_norm: 2.3653
2023-02-19 09:05:02,161 - mmseg - INFO - Iter [63250/160000]	lr: 3.628e-05, eta: 7:40:43, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1525, decode.acc_seg: 93.6693, aux.loss_ce: 0.0880, aux.acc_seg: 90.9807, loss: 0.2405, grad_norm: 2.2320
2023-02-19 09:05:17,047 - mmseg - INFO - Iter [63300/160000]	lr: 3.626e-05, eta: 7:40:30, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1582, decode.acc_seg: 93.3970, aux.loss_ce: 0.0909, aux.acc_seg: 90.7669, loss: 0.2491, grad_norm: 2.4771
2023-02-19 09:05:31,293 - mmseg - INFO - Iter [63350/160000]	lr: 3.624e-05, eta: 7:40:15, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1523, decode.acc_seg: 93.7510, aux.loss_ce: 0.0845, aux.acc_seg: 91.3285, loss: 0.2367, grad_norm: 2.1845
2023-02-19 09:05:45,529 - mmseg - INFO - Iter [63400/160000]	lr: 3.623e-05, eta: 7:40:01, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1658, decode.acc_seg: 93.1212, aux.loss_ce: 0.0952, aux.acc_seg: 90.3622, loss: 0.2610, grad_norm: 2.7096
2023-02-19 09:05:59,867 - mmseg - INFO - Iter [63450/160000]	lr: 3.621e-05, eta: 7:39:47, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1615, decode.acc_seg: 93.3709, aux.loss_ce: 0.0919, aux.acc_seg: 90.6556, loss: 0.2534, grad_norm: 2.2982
2023-02-19 09:06:13,864 - mmseg - INFO - Iter [63500/160000]	lr: 3.619e-05, eta: 7:39:32, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1585, decode.acc_seg: 93.5331, aux.loss_ce: 0.0929, aux.acc_seg: 90.5226, loss: 0.2515, grad_norm: 2.2217
2023-02-19 09:06:27,939 - mmseg - INFO - Iter [63550/160000]	lr: 3.617e-05, eta: 7:39:18, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1632, decode.acc_seg: 93.3473, aux.loss_ce: 0.0914, aux.acc_seg: 90.6791, loss: 0.2546, grad_norm: 2.7505
2023-02-19 09:06:41,730 - mmseg - INFO - Iter [63600/160000]	lr: 3.615e-05, eta: 7:39:02, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1616, decode.acc_seg: 93.3224, aux.loss_ce: 0.0900, aux.acc_seg: 90.8512, loss: 0.2516, grad_norm: 2.3670
2023-02-19 09:06:55,604 - mmseg - INFO - Iter [63650/160000]	lr: 3.613e-05, eta: 7:38:48, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1597, decode.acc_seg: 93.3107, aux.loss_ce: 0.0893, aux.acc_seg: 90.7265, loss: 0.2489, grad_norm: 2.6616
2023-02-19 09:07:09,266 - mmseg - INFO - Iter [63700/160000]	lr: 3.611e-05, eta: 7:38:32, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1568, decode.acc_seg: 93.5442, aux.loss_ce: 0.0879, aux.acc_seg: 90.9912, loss: 0.2447, grad_norm: 1.8486
2023-02-19 09:07:22,965 - mmseg - INFO - Iter [63750/160000]	lr: 3.609e-05, eta: 7:38:17, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1578, decode.acc_seg: 93.3453, aux.loss_ce: 0.0876, aux.acc_seg: 90.8092, loss: 0.2453, grad_norm: 2.5800
2023-02-19 09:07:36,849 - mmseg - INFO - Iter [63800/160000]	lr: 3.608e-05, eta: 7:38:02, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1581, decode.acc_seg: 93.4268, aux.loss_ce: 0.0878, aux.acc_seg: 90.9969, loss: 0.2459, grad_norm: 2.3629
2023-02-19 09:07:50,454 - mmseg - INFO - Iter [63850/160000]	lr: 3.606e-05, eta: 7:37:47, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1603, decode.acc_seg: 93.2808, aux.loss_ce: 0.0892, aux.acc_seg: 90.7361, loss: 0.2495, grad_norm: 2.3401
2023-02-19 09:08:04,277 - mmseg - INFO - Iter [63900/160000]	lr: 3.604e-05, eta: 7:37:32, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1596, decode.acc_seg: 93.3604, aux.loss_ce: 0.0909, aux.acc_seg: 90.7306, loss: 0.2505, grad_norm: 2.5044
2023-02-19 09:08:18,426 - mmseg - INFO - Iter [63950/160000]	lr: 3.602e-05, eta: 7:37:18, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1498, decode.acc_seg: 93.6329, aux.loss_ce: 0.0852, aux.acc_seg: 90.9973, loss: 0.2350, grad_norm: 2.2566
2023-02-19 09:08:32,118 - mmseg - INFO - Saving checkpoint at 64000 iterations
2023-02-19 09:08:35,472 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:08:35,473 - mmseg - INFO - Iter [64000/160000]	lr: 3.600e-05, eta: 7:37:07, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1486, decode.acc_seg: 93.6735, aux.loss_ce: 0.0839, aux.acc_seg: 90.9612, loss: 0.2324, grad_norm: 2.1215
2023-02-19 09:08:49,880 - mmseg - INFO - per class results:
2023-02-19 09:08:49,885 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 77.56 | 85.38 |
|       building      | 84.03 | 91.75 |
|         sky         | 93.84 | 97.85 |
|        floor        | 81.43 | 92.01 |
|         tree        | 75.35 | 86.45 |
|       ceiling       | 81.67 | 96.32 |
|         road        | 83.83 | 91.31 |
|         bed         | 89.52 | 96.33 |
|      windowpane     | 61.06 | 75.62 |
|        grass        | 66.91 | 81.28 |
|       cabinet       | 59.51 | 74.19 |
|       sidewalk      | 68.29 |  81.5 |
|        person       | 81.75 | 90.83 |
|        earth        | 34.87 | 44.66 |
|         door        | 51.21 | 66.75 |
|        table        | 61.09 | 81.81 |
|       mountain      | 58.39 | 71.76 |
|        plant        | 52.34 | 63.97 |
|       curtain       |  74.3 | 86.19 |
|        chair        | 62.21 | 75.64 |
|         car         | 84.75 | 90.23 |
|        water        | 50.96 | 60.47 |
|       painting      | 72.67 | 91.69 |
|         sofa        | 73.37 | 82.82 |
|        shelf        | 44.02 | 62.45 |
|        house        | 57.75 | 71.92 |
|         sea         | 55.97 | 86.45 |
|        mirror       | 67.97 | 85.34 |
|         rug         | 59.01 | 68.43 |
|        field        | 36.04 | 66.86 |
|       armchair      | 52.92 | 75.03 |
|         seat        | 63.02 | 82.99 |
|        fence        | 43.76 | 64.52 |
|         desk        | 53.69 | 68.71 |
|         rock        | 46.61 | 74.53 |
|       wardrobe      | 43.63 | 72.45 |
|         lamp        | 63.89 |  76.0 |
|       bathtub       | 80.07 | 81.26 |
|       railing       | 37.99 | 60.14 |
|       cushion       | 59.17 | 80.65 |
|         base        | 34.34 | 42.99 |
|         box         | 28.94 | 42.88 |
|        column       | 51.73 | 70.98 |
|      signboard      | 39.57 | 58.52 |
|   chest of drawers  | 38.46 | 54.92 |
|       counter       | 35.27 | 46.89 |
|         sand        | 58.84 | 77.26 |
|         sink        |  71.5 | 78.41 |
|      skyscraper     | 60.53 | 76.88 |
|      fireplace      | 75.46 |  94.1 |
|     refrigerator    | 73.88 | 89.26 |
|      grandstand     | 38.51 | 78.62 |
|         path        | 25.48 | 42.02 |
|        stairs       |  24.0 | 28.54 |
|        runway       |  67.6 | 90.33 |
|         case        |  50.9 | 60.46 |
|      pool table     | 93.54 | 96.29 |
|        pillow       | 49.41 | 53.91 |
|     screen door     |  65.4 | 92.46 |
|       stairway      | 34.31 | 47.17 |
|        river        |  9.64 | 17.38 |
|        bridge       | 65.58 |  81.0 |
|       bookcase      | 42.34 | 69.42 |
|        blind        | 49.34 | 67.68 |
|     coffee table    | 62.51 | 71.15 |
|        toilet       | 86.03 | 89.84 |
|        flower       | 38.68 | 50.42 |
|         book        | 39.79 | 66.47 |
|         hill        | 16.25 | 26.93 |
|        bench        | 45.58 | 52.28 |
|      countertop     | 54.44 | 73.22 |
|        stove        | 80.76 | 86.58 |
|         palm        | 55.37 | 78.54 |
|    kitchen island   | 43.39 | 63.49 |
|       computer      | 77.93 | 91.58 |
|     swivel chair    | 43.16 | 57.73 |
|         boat        | 51.55 | 62.78 |
|         bar         |  22.8 | 26.96 |
|    arcade machine   | 71.32 | 75.46 |
|        hovel        | 45.69 | 54.49 |
|         bus         | 89.98 | 96.61 |
|        towel        | 72.22 | 87.83 |
|        light        |  56.1 | 63.18 |
|        truck        | 40.16 |  50.0 |
|        tower        |  28.6 | 50.99 |
|      chandelier     | 62.01 | 71.38 |
|        awning       | 31.57 |  36.5 |
|     streetlight     | 27.32 | 33.97 |
|        booth        |  40.1 | 50.18 |
| television receiver | 72.21 | 81.89 |
|       airplane      | 58.09 | 64.36 |
|      dirt track     |  8.44 | 37.62 |
|       apparel       | 36.27 | 54.21 |
|         pole        | 20.34 | 24.45 |
|         land        |  5.07 |  6.62 |
|      bannister      |  13.1 | 16.05 |
|      escalator      |  41.4 | 56.55 |
|       ottoman       | 41.47 |  53.7 |
|        bottle       | 35.22 | 72.44 |
|        buffet       | 52.26 | 63.29 |
|        poster       |  23.1 | 30.72 |
|        stage        | 13.08 | 18.07 |
|         van         | 41.75 |  59.9 |
|         ship        | 59.48 | 88.65 |
|       fountain      | 37.65 | 38.57 |
|    conveyer belt    | 83.87 | 89.28 |
|        canopy       |  36.8 |  42.5 |
|        washer       | 74.26 | 76.55 |
|      plaything      | 31.11 | 38.49 |
|    swimming pool    | 49.02 | 65.57 |
|        stool        | 42.46 | 51.53 |
|        barrel       | 53.77 | 68.11 |
|        basket       | 37.15 | 43.66 |
|      waterfall      | 50.26 | 58.05 |
|         tent        | 95.66 | 97.58 |
|         bag         | 10.16 | 10.77 |
|       minibike      | 69.17 | 89.68 |
|        cradle       | 79.55 | 86.53 |
|         oven        | 53.82 | 64.26 |
|         ball        | 54.86 | 70.86 |
|         food        | 48.75 |  52.2 |
|         step        | 10.46 |  12.1 |
|         tank        | 62.73 |  66.6 |
|      trade name     | 21.05 | 23.72 |
|      microwave      | 79.19 |  96.3 |
|         pot         | 51.84 | 59.85 |
|        animal       | 66.12 | 70.66 |
|       bicycle       | 58.37 | 77.55 |
|         lake        | 50.15 | 60.41 |
|      dishwasher     | 67.75 |  71.9 |
|        screen       | 56.92 | 71.44 |
|       blanket       |  22.2 | 25.29 |
|      sculpture      | 70.77 | 81.68 |
|         hood        | 64.69 | 95.83 |
|        sconce       | 33.49 |  36.0 |
|         vase        | 41.02 | 62.84 |
|    traffic light    | 33.63 | 38.22 |
|         tray        | 11.25 | 28.55 |
|        ashcan       | 44.02 | 50.68 |
|         fan         | 66.41 | 75.48 |
|         pier        | 36.64 | 42.85 |
|      crt screen     |  7.52 | 21.73 |
|        plate        | 54.99 | 79.33 |
|       monitor       |  6.96 |  7.93 |
|    bulletin board   | 33.61 | 37.83 |
|        shower       |  1.53 |  1.87 |
|       radiator      | 62.61 | 69.22 |
|        glass        | 15.94 | 21.51 |
|        clock        | 39.69 | 44.77 |
|         flag        | 48.02 | 56.71 |
+---------------------+-------+-------+
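
For reference on how the columns above and the Summary that follows are derived (a hedged sketch of the standard definitions, not a copy of the mmseg implementation): from the accumulated pixel confusion matrix, per-class IoU = TP / (TP + FP + FN) and per-class Acc = TP / (TP + FN); mIoU and mAcc in the Summary are the unweighted (nan-ignoring) means of those 150 per-class values, and aAcc is overall pixel accuracy.

import numpy as np

# cm has shape (num_classes, num_classes); cm[i, j] counts pixels whose
# ground-truth class is i and predicted class is j.
def summarize(cm: np.ndarray):
    tp = np.diag(cm).astype(float)
    fp = cm.sum(axis=0) - tp          # predicted as the class but wrong
    fn = cm.sum(axis=1) - tp          # ground truth is the class but missed
    iou = tp / (tp + fp + fn)         # per-class IoU column
    acc = tp / (tp + fn)              # per-class Acc column (recall)
    return {
        "aAcc": tp.sum() / cm.sum(),  # overall pixel accuracy
        "mIoU": np.nanmean(iou),      # unweighted mean over classes
        "mAcc": np.nanmean(acc),
    }
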
2023-02-19 09:08:49,886 - mmseg - INFO - Summary:
2023-02-19 09:08:49,886 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 83.45 | 51.09 | 63.49 |
+-------+-------+-------+
2023-02-19 09:08:53,061 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_64000.pth.
2023-02-19 09:08:53,061 - mmseg - INFO - Best mIoU is 0.5109 at 64000 iter.
2023-02-19 09:08:53,061 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:08:53,062 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8345, mIoU: 0.5109, mAcc: 0.6349, IoU.wall: 0.7756, IoU.building: 0.8403, IoU.sky: 0.9384, IoU.floor: 0.8143, IoU.tree: 0.7535, IoU.ceiling: 0.8167, IoU.road: 0.8383, IoU.bed : 0.8952, IoU.windowpane: 0.6106, IoU.grass: 0.6691, IoU.cabinet: 0.5951, IoU.sidewalk: 0.6829, IoU.person: 0.8175, IoU.earth: 0.3487, IoU.door: 0.5121, IoU.table: 0.6109, IoU.mountain: 0.5839, IoU.plant: 0.5234, IoU.curtain: 0.7430, IoU.chair: 0.6221, IoU.car: 0.8475, IoU.water: 0.5096, IoU.painting: 0.7267, IoU.sofa: 0.7337, IoU.shelf: 0.4402, IoU.house: 0.5775, IoU.sea: 0.5597, IoU.mirror: 0.6797, IoU.rug: 0.5901, IoU.field: 0.3604, IoU.armchair: 0.5292, IoU.seat: 0.6302, IoU.fence: 0.4376, IoU.desk: 0.5369, IoU.rock: 0.4661, IoU.wardrobe: 0.4363, IoU.lamp: 0.6389, IoU.bathtub: 0.8007, IoU.railing: 0.3799, IoU.cushion: 0.5917, IoU.base: 0.3434, IoU.box: 0.2894, IoU.column: 0.5173, IoU.signboard: 0.3957, IoU.chest of drawers: 0.3846, IoU.counter: 0.3527, IoU.sand: 0.5884, IoU.sink: 0.7150, IoU.skyscraper: 0.6053, IoU.fireplace: 0.7546, IoU.refrigerator: 0.7388, IoU.grandstand: 0.3851, IoU.path: 0.2548, IoU.stairs: 0.2400, IoU.runway: 0.6760, IoU.case: 0.5090, IoU.pool table: 0.9354, IoU.pillow: 0.4941, IoU.screen door: 0.6540, IoU.stairway: 0.3431, IoU.river: 0.0964, IoU.bridge: 0.6558, IoU.bookcase: 0.4234, IoU.blind: 0.4934, IoU.coffee table: 0.6251, IoU.toilet: 0.8603, IoU.flower: 0.3868, IoU.book: 0.3979, IoU.hill: 0.1625, IoU.bench: 0.4558, IoU.countertop: 0.5444, IoU.stove: 0.8076, IoU.palm: 0.5537, IoU.kitchen island: 0.4339, IoU.computer: 0.7793, IoU.swivel chair: 0.4316, IoU.boat: 0.5155, IoU.bar: 0.2280, IoU.arcade machine: 0.7132, IoU.hovel: 0.4569, IoU.bus: 0.8998, IoU.towel: 0.7222, IoU.light: 0.5610, IoU.truck: 0.4016, IoU.tower: 0.2860, IoU.chandelier: 0.6201, IoU.awning: 0.3157, IoU.streetlight: 0.2732, IoU.booth: 0.4010, IoU.television receiver: 0.7221, IoU.airplane: 0.5809, IoU.dirt track: 0.0844, IoU.apparel: 0.3627, IoU.pole: 0.2034, IoU.land: 0.0507, IoU.bannister: 0.1310, IoU.escalator: 0.4140, IoU.ottoman: 0.4147, IoU.bottle: 0.3522, IoU.buffet: 0.5226, IoU.poster: 0.2310, IoU.stage: 0.1308, IoU.van: 0.4175, IoU.ship: 0.5948, IoU.fountain: 0.3765, IoU.conveyer belt: 0.8387, IoU.canopy: 0.3680, IoU.washer: 0.7426, IoU.plaything: 0.3111, IoU.swimming pool: 0.4902, IoU.stool: 0.4246, IoU.barrel: 0.5377, IoU.basket: 0.3715, IoU.waterfall: 0.5026, IoU.tent: 0.9566, IoU.bag: 0.1016, IoU.minibike: 0.6917, IoU.cradle: 0.7955, IoU.oven: 0.5382, IoU.ball: 0.5486, IoU.food: 0.4875, IoU.step: 0.1046, IoU.tank: 0.6273, IoU.trade name: 0.2105, IoU.microwave: 0.7919, IoU.pot: 0.5184, IoU.animal: 0.6612, IoU.bicycle: 0.5837, IoU.lake: 0.5015, IoU.dishwasher: 0.6775, IoU.screen: 0.5692, IoU.blanket: 0.2220, IoU.sculpture: 0.7077, IoU.hood: 0.6469, IoU.sconce: 0.3349, IoU.vase: 0.4102, IoU.traffic light: 0.3363, IoU.tray: 0.1125, IoU.ashcan: 0.4402, IoU.fan: 0.6641, IoU.pier: 0.3664, IoU.crt screen: 0.0752, IoU.plate: 0.5499, IoU.monitor: 0.0696, IoU.bulletin board: 0.3361, IoU.shower: 0.0153, IoU.radiator: 0.6261, IoU.glass: 0.1594, IoU.clock: 0.3969, IoU.flag: 0.4802, Acc.wall: 0.8538, Acc.building: 0.9175, Acc.sky: 0.9785, Acc.floor: 0.9201, Acc.tree: 0.8645, Acc.ceiling: 0.9632, Acc.road: 0.9131, Acc.bed : 0.9633, Acc.windowpane: 0.7562, Acc.grass: 0.8128, Acc.cabinet: 0.7419, Acc.sidewalk: 0.8150, Acc.person: 0.9083, Acc.earth: 0.4466, Acc.door: 0.6675, Acc.table: 0.8181, Acc.mountain: 0.7176, Acc.plant: 0.6397, Acc.curtain: 0.8619, 
Acc.chair: 0.7564, Acc.car: 0.9023, Acc.water: 0.6047, Acc.painting: 0.9169, Acc.sofa: 0.8282, Acc.shelf: 0.6245, Acc.house: 0.7192, Acc.sea: 0.8645, Acc.mirror: 0.8534, Acc.rug: 0.6843, Acc.field: 0.6686, Acc.armchair: 0.7503, Acc.seat: 0.8299, Acc.fence: 0.6452, Acc.desk: 0.6871, Acc.rock: 0.7453, Acc.wardrobe: 0.7245, Acc.lamp: 0.7600, Acc.bathtub: 0.8126, Acc.railing: 0.6014, Acc.cushion: 0.8065, Acc.base: 0.4299, Acc.box: 0.4288, Acc.column: 0.7098, Acc.signboard: 0.5852, Acc.chest of drawers: 0.5492, Acc.counter: 0.4689, Acc.sand: 0.7726, Acc.sink: 0.7841, Acc.skyscraper: 0.7688, Acc.fireplace: 0.9410, Acc.refrigerator: 0.8926, Acc.grandstand: 0.7862, Acc.path: 0.4202, Acc.stairs: 0.2854, Acc.runway: 0.9033, Acc.case: 0.6046, Acc.pool table: 0.9629, Acc.pillow: 0.5391, Acc.screen door: 0.9246, Acc.stairway: 0.4717, Acc.river: 0.1738, Acc.bridge: 0.8100, Acc.bookcase: 0.6942, Acc.blind: 0.6768, Acc.coffee table: 0.7115, Acc.toilet: 0.8984, Acc.flower: 0.5042, Acc.book: 0.6647, Acc.hill: 0.2693, Acc.bench: 0.5228, Acc.countertop: 0.7322, Acc.stove: 0.8658, Acc.palm: 0.7854, Acc.kitchen island: 0.6349, Acc.computer: 0.9158, Acc.swivel chair: 0.5773, Acc.boat: 0.6278, Acc.bar: 0.2696, Acc.arcade machine: 0.7546, Acc.hovel: 0.5449, Acc.bus: 0.9661, Acc.towel: 0.8783, Acc.light: 0.6318, Acc.truck: 0.5000, Acc.tower: 0.5099, Acc.chandelier: 0.7138, Acc.awning: 0.3650, Acc.streetlight: 0.3397, Acc.booth: 0.5018, Acc.television receiver: 0.8189, Acc.airplane: 0.6436, Acc.dirt track: 0.3762, Acc.apparel: 0.5421, Acc.pole: 0.2445, Acc.land: 0.0662, Acc.bannister: 0.1605, Acc.escalator: 0.5655, Acc.ottoman: 0.5370, Acc.bottle: 0.7244, Acc.buffet: 0.6329, Acc.poster: 0.3072, Acc.stage: 0.1807, Acc.van: 0.5990, Acc.ship: 0.8865, Acc.fountain: 0.3857, Acc.conveyer belt: 0.8928, Acc.canopy: 0.4250, Acc.washer: 0.7655, Acc.plaything: 0.3849, Acc.swimming pool: 0.6557, Acc.stool: 0.5153, Acc.barrel: 0.6811, Acc.basket: 0.4366, Acc.waterfall: 0.5805, Acc.tent: 0.9758, Acc.bag: 0.1077, Acc.minibike: 0.8968, Acc.cradle: 0.8653, Acc.oven: 0.6426, Acc.ball: 0.7086, Acc.food: 0.5220, Acc.step: 0.1210, Acc.tank: 0.6660, Acc.trade name: 0.2372, Acc.microwave: 0.9630, Acc.pot: 0.5985, Acc.animal: 0.7066, Acc.bicycle: 0.7755, Acc.lake: 0.6041, Acc.dishwasher: 0.7190, Acc.screen: 0.7144, Acc.blanket: 0.2529, Acc.sculpture: 0.8168, Acc.hood: 0.9583, Acc.sconce: 0.3600, Acc.vase: 0.6284, Acc.traffic light: 0.3822, Acc.tray: 0.2855, Acc.ashcan: 0.5068, Acc.fan: 0.7548, Acc.pier: 0.4285, Acc.crt screen: 0.2173, Acc.plate: 0.7933, Acc.monitor: 0.0793, Acc.bulletin board: 0.3783, Acc.shower: 0.0187, Acc.radiator: 0.6922, Acc.glass: 0.2151, Acc.clock: 0.4477, Acc.flag: 0.5671
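The summary row ties back to the per-class table: mIoU and mAcc are the unweighted means of the 150 per-class IoU/Acc values (51.09 and 63.49 here), while aAcc is the overall pixel accuracy (83.45); because this mIoU improves on the previous best, the checkpoint is kept as best_mIoU_iter_64000.pth. A hedged sketch of that aggregation, reusing the confusion-matrix layout from the sketch after the table above (not mmseg's internals):

import numpy as np

def summarize(iou, acc, cm):
    """iou, acc: per-class arrays in percent; cm: confusion matrix (rows = GT)."""
    miou = float(np.nanmean(iou))                     # -> 51.09 for this evaluation
    macc = float(np.nanmean(acc))                     # -> 63.49
    aacc = float(100 * np.diag(cm).sum() / cm.sum())  # -> 83.45 (overall pixel accuracy)
    return aacc, miou, macc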
2023-02-19 09:09:07,475 - mmseg - INFO - Iter [64050/160000]	lr: 3.598e-05, eta: 7:37:20, time: 0.640, data_time: 0.356, memory: 15214, decode.loss_ce: 0.1597, decode.acc_seg: 93.4708, aux.loss_ce: 0.0925, aux.acc_seg: 90.4881, loss: 0.2522, grad_norm: 2.5745
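In the training lines, the logged loss is simply the sum of the two CE terms, the decode (main) head and the auxiliary head, each already scaled by its configured loss weight: 0.1597 + 0.0925 = 0.2522 in the line above. A toy check of that relation (not mmseg's log parser; the dict literal is copied from that line by hand):

log_vars = {"decode.loss_ce": 0.1597, "aux.loss_ce": 0.0925}
total = sum(v for k, v in log_vars.items() if k.endswith("loss_ce"))
print(round(total, 4))  # 0.2522, matching the logged loss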
2023-02-19 09:09:21,675 - mmseg - INFO - Iter [64100/160000]	lr: 3.596e-05, eta: 7:37:05, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1740, decode.acc_seg: 92.9519, aux.loss_ce: 0.0959, aux.acc_seg: 90.4620, loss: 0.2700, grad_norm: 2.8127
2023-02-19 09:09:35,645 - mmseg - INFO - Iter [64150/160000]	lr: 3.594e-05, eta: 7:36:50, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1606, decode.acc_seg: 93.5027, aux.loss_ce: 0.0918, aux.acc_seg: 90.7793, loss: 0.2524, grad_norm: 2.2274
2023-02-19 09:09:49,415 - mmseg - INFO - Iter [64200/160000]	lr: 3.593e-05, eta: 7:36:35, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1580, decode.acc_seg: 93.5914, aux.loss_ce: 0.0929, aux.acc_seg: 90.6724, loss: 0.2508, grad_norm: 2.4512
2023-02-19 09:10:03,702 - mmseg - INFO - Iter [64250/160000]	lr: 3.591e-05, eta: 7:36:21, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1692, decode.acc_seg: 92.9670, aux.loss_ce: 0.0904, aux.acc_seg: 90.6250, loss: 0.2597, grad_norm: 2.2985
2023-02-19 09:10:18,038 - mmseg - INFO - Iter [64300/160000]	lr: 3.589e-05, eta: 7:36:07, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1605, decode.acc_seg: 93.2738, aux.loss_ce: 0.0878, aux.acc_seg: 90.9459, loss: 0.2484, grad_norm: 1.9486
2023-02-19 09:10:32,024 - mmseg - INFO - Iter [64350/160000]	lr: 3.587e-05, eta: 7:35:52, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1604, decode.acc_seg: 93.4057, aux.loss_ce: 0.0905, aux.acc_seg: 90.5245, loss: 0.2509, grad_norm: 2.6543
2023-02-19 09:10:45,980 - mmseg - INFO - Iter [64400/160000]	lr: 3.585e-05, eta: 7:35:37, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1620, decode.acc_seg: 93.3806, aux.loss_ce: 0.0951, aux.acc_seg: 90.3331, loss: 0.2571, grad_norm: 2.3405
2023-02-19 09:11:02,066 - mmseg - INFO - Iter [64450/160000]	lr: 3.583e-05, eta: 7:35:26, time: 0.322, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1590, decode.acc_seg: 93.4816, aux.loss_ce: 0.0891, aux.acc_seg: 90.9910, loss: 0.2481, grad_norm: 2.3268
2023-02-19 09:11:15,973 - mmseg - INFO - Iter [64500/160000]	lr: 3.581e-05, eta: 7:35:11, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1508, decode.acc_seg: 93.6965, aux.loss_ce: 0.0867, aux.acc_seg: 91.0518, loss: 0.2375, grad_norm: 2.1904
2023-02-19 09:11:29,828 - mmseg - INFO - Iter [64550/160000]	lr: 3.579e-05, eta: 7:34:56, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1656, decode.acc_seg: 93.1448, aux.loss_ce: 0.0883, aux.acc_seg: 91.0385, loss: 0.2539, grad_norm: 2.9731
2023-02-19 09:11:43,851 - mmseg - INFO - Iter [64600/160000]	lr: 3.578e-05, eta: 7:34:41, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1562, decode.acc_seg: 93.5755, aux.loss_ce: 0.0869, aux.acc_seg: 91.1101, loss: 0.2431, grad_norm: 2.0487
2023-02-19 09:11:58,260 - mmseg - INFO - Iter [64650/160000]	lr: 3.576e-05, eta: 7:34:27, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1578, decode.acc_seg: 93.5130, aux.loss_ce: 0.0867, aux.acc_seg: 91.2114, loss: 0.2445, grad_norm: 2.0773
2023-02-19 09:12:11,973 - mmseg - INFO - Iter [64700/160000]	lr: 3.574e-05, eta: 7:34:12, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1592, decode.acc_seg: 93.1316, aux.loss_ce: 0.0906, aux.acc_seg: 90.3050, loss: 0.2498, grad_norm: 2.2628
2023-02-19 09:12:26,047 - mmseg - INFO - Iter [64750/160000]	lr: 3.572e-05, eta: 7:33:57, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1564, decode.acc_seg: 93.4762, aux.loss_ce: 0.0872, aux.acc_seg: 91.0431, loss: 0.2437, grad_norm: 2.2599
2023-02-19 09:12:39,643 - mmseg - INFO - Iter [64800/160000]	lr: 3.570e-05, eta: 7:33:42, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1520, decode.acc_seg: 93.6437, aux.loss_ce: 0.0840, aux.acc_seg: 91.2848, loss: 0.2360, grad_norm: 2.1062
2023-02-19 09:12:53,533 - mmseg - INFO - Iter [64850/160000]	lr: 3.568e-05, eta: 7:33:27, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1728, decode.acc_seg: 93.1898, aux.loss_ce: 0.0977, aux.acc_seg: 90.5539, loss: 0.2705, grad_norm: 2.6073
2023-02-19 09:13:07,744 - mmseg - INFO - Iter [64900/160000]	lr: 3.566e-05, eta: 7:33:12, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1470, decode.acc_seg: 93.9053, aux.loss_ce: 0.0830, aux.acc_seg: 91.4722, loss: 0.2300, grad_norm: 2.2301
2023-02-19 09:13:21,929 - mmseg - INFO - Iter [64950/160000]	lr: 3.564e-05, eta: 7:32:58, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1606, decode.acc_seg: 93.3950, aux.loss_ce: 0.0920, aux.acc_seg: 90.5632, loss: 0.2526, grad_norm: 2.0333
2023-02-19 09:13:36,366 - mmseg - INFO - Saving checkpoint at 65000 iterations
2023-02-19 09:13:39,679 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:13:39,679 - mmseg - INFO - Iter [65000/160000]	lr: 3.563e-05, eta: 7:32:49, time: 0.355, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1559, decode.acc_seg: 93.5785, aux.loss_ce: 0.0876, aux.acc_seg: 91.0100, loss: 0.2435, grad_norm: 2.2798
2023-02-19 09:13:54,724 - mmseg - INFO - Iter [65050/160000]	lr: 3.561e-05, eta: 7:32:35, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1573, decode.acc_seg: 93.2735, aux.loss_ce: 0.0879, aux.acc_seg: 90.6649, loss: 0.2453, grad_norm: 2.4362
2023-02-19 09:14:08,424 - mmseg - INFO - Iter [65100/160000]	lr: 3.559e-05, eta: 7:32:20, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1610, decode.acc_seg: 93.2326, aux.loss_ce: 0.0899, aux.acc_seg: 90.8010, loss: 0.2510, grad_norm: 2.3083
2023-02-19 09:14:22,450 - mmseg - INFO - Iter [65150/160000]	lr: 3.557e-05, eta: 7:32:06, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1739, decode.acc_seg: 92.9755, aux.loss_ce: 0.0995, aux.acc_seg: 90.1311, loss: 0.2734, grad_norm: 4.0598
2023-02-19 09:14:36,587 - mmseg - INFO - Iter [65200/160000]	lr: 3.555e-05, eta: 7:31:51, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1532, decode.acc_seg: 93.6920, aux.loss_ce: 0.0881, aux.acc_seg: 90.9250, loss: 0.2413, grad_norm: 2.8026
2023-02-19 09:14:50,652 - mmseg - INFO - Iter [65250/160000]	lr: 3.553e-05, eta: 7:31:36, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1534, decode.acc_seg: 93.7210, aux.loss_ce: 0.0877, aux.acc_seg: 91.1759, loss: 0.2411, grad_norm: 2.1721
2023-02-19 09:15:04,707 - mmseg - INFO - Iter [65300/160000]	lr: 3.551e-05, eta: 7:31:22, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1550, decode.acc_seg: 93.5440, aux.loss_ce: 0.0903, aux.acc_seg: 90.6023, loss: 0.2453, grad_norm: 2.2936
2023-02-19 09:15:19,105 - mmseg - INFO - Iter [65350/160000]	lr: 3.549e-05, eta: 7:31:08, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1546, decode.acc_seg: 93.6220, aux.loss_ce: 0.0867, aux.acc_seg: 90.9199, loss: 0.2413, grad_norm: 1.9571
2023-02-19 09:15:32,845 - mmseg - INFO - Iter [65400/160000]	lr: 3.548e-05, eta: 7:30:53, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1605, decode.acc_seg: 93.5640, aux.loss_ce: 0.0892, aux.acc_seg: 90.9547, loss: 0.2497, grad_norm: 2.6262
2023-02-19 09:15:46,513 - mmseg - INFO - Iter [65450/160000]	lr: 3.546e-05, eta: 7:30:37, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1609, decode.acc_seg: 93.5540, aux.loss_ce: 0.0880, aux.acc_seg: 91.2458, loss: 0.2489, grad_norm: 2.4217
2023-02-19 09:16:00,381 - mmseg - INFO - Iter [65500/160000]	lr: 3.544e-05, eta: 7:30:22, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1602, decode.acc_seg: 93.5229, aux.loss_ce: 0.0924, aux.acc_seg: 90.8403, loss: 0.2526, grad_norm: 2.3350
2023-02-19 09:16:14,365 - mmseg - INFO - Iter [65550/160000]	lr: 3.542e-05, eta: 7:30:08, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1567, decode.acc_seg: 93.4335, aux.loss_ce: 0.0875, aux.acc_seg: 90.8620, loss: 0.2443, grad_norm: 2.4195
2023-02-19 09:16:28,536 - mmseg - INFO - Iter [65600/160000]	lr: 3.540e-05, eta: 7:29:53, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1692, decode.acc_seg: 93.0470, aux.loss_ce: 0.0927, aux.acc_seg: 90.5318, loss: 0.2619, grad_norm: 2.8674
2023-02-19 09:16:42,611 - mmseg - INFO - Iter [65650/160000]	lr: 3.538e-05, eta: 7:29:39, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1596, decode.acc_seg: 93.2533, aux.loss_ce: 0.0893, aux.acc_seg: 90.6823, loss: 0.2489, grad_norm: 2.3966
2023-02-19 09:16:58,766 - mmseg - INFO - Iter [65700/160000]	lr: 3.536e-05, eta: 7:29:27, time: 0.323, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1614, decode.acc_seg: 93.4716, aux.loss_ce: 0.0897, aux.acc_seg: 91.0283, loss: 0.2511, grad_norm: 2.3329
2023-02-19 09:17:12,564 - mmseg - INFO - Iter [65750/160000]	lr: 3.534e-05, eta: 7:29:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1595, decode.acc_seg: 93.3069, aux.loss_ce: 0.0908, aux.acc_seg: 90.6304, loss: 0.2503, grad_norm: 2.6062
2023-02-19 09:17:26,304 - mmseg - INFO - Iter [65800/160000]	lr: 3.533e-05, eta: 7:28:57, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1600, decode.acc_seg: 93.3314, aux.loss_ce: 0.0926, aux.acc_seg: 90.5340, loss: 0.2526, grad_norm: 2.4461
2023-02-19 09:17:40,137 - mmseg - INFO - Iter [65850/160000]	lr: 3.531e-05, eta: 7:28:42, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1639, decode.acc_seg: 93.6127, aux.loss_ce: 0.0906, aux.acc_seg: 91.2184, loss: 0.2545, grad_norm: 2.8766
2023-02-19 09:17:53,869 - mmseg - INFO - Iter [65900/160000]	lr: 3.529e-05, eta: 7:28:27, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1618, decode.acc_seg: 93.3595, aux.loss_ce: 0.0910, aux.acc_seg: 90.7142, loss: 0.2529, grad_norm: 2.4521
2023-02-19 09:18:07,892 - mmseg - INFO - Iter [65950/160000]	lr: 3.527e-05, eta: 7:28:12, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1497, decode.acc_seg: 93.6500, aux.loss_ce: 0.0848, aux.acc_seg: 91.2600, loss: 0.2345, grad_norm: 1.9390
2023-02-19 09:18:22,109 - mmseg - INFO - Saving checkpoint at 66000 iterations
2023-02-19 09:18:25,325 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:18:25,325 - mmseg - INFO - Iter [66000/160000]	lr: 3.525e-05, eta: 7:28:02, time: 0.349, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1669, decode.acc_seg: 93.0225, aux.loss_ce: 0.0951, aux.acc_seg: 90.2030, loss: 0.2619, grad_norm: 2.2500
2023-02-19 09:18:40,330 - mmseg - INFO - Iter [66050/160000]	lr: 3.523e-05, eta: 7:27:49, time: 0.299, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1636, decode.acc_seg: 93.1696, aux.loss_ce: 0.0893, aux.acc_seg: 90.6857, loss: 0.2529, grad_norm: 2.5688
2023-02-19 09:18:54,177 - mmseg - INFO - Iter [66100/160000]	lr: 3.521e-05, eta: 7:27:34, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1588, decode.acc_seg: 93.3703, aux.loss_ce: 0.0906, aux.acc_seg: 90.6536, loss: 0.2495, grad_norm: 2.2557
2023-02-19 09:19:08,332 - mmseg - INFO - Iter [66150/160000]	lr: 3.519e-05, eta: 7:27:19, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1620, decode.acc_seg: 93.3362, aux.loss_ce: 0.0922, aux.acc_seg: 90.5957, loss: 0.2543, grad_norm: 2.5121
2023-02-19 09:19:22,126 - mmseg - INFO - Iter [66200/160000]	lr: 3.518e-05, eta: 7:27:04, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1599, decode.acc_seg: 93.4112, aux.loss_ce: 0.0901, aux.acc_seg: 90.8255, loss: 0.2500, grad_norm: 2.3110
2023-02-19 09:19:35,834 - mmseg - INFO - Iter [66250/160000]	lr: 3.516e-05, eta: 7:26:49, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1668, decode.acc_seg: 93.1747, aux.loss_ce: 0.0911, aux.acc_seg: 90.5712, loss: 0.2580, grad_norm: 3.5227
2023-02-19 09:19:49,818 - mmseg - INFO - Iter [66300/160000]	lr: 3.514e-05, eta: 7:26:35, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1617, decode.acc_seg: 93.1730, aux.loss_ce: 0.0921, aux.acc_seg: 90.4005, loss: 0.2538, grad_norm: 2.4244
2023-02-19 09:20:03,807 - mmseg - INFO - Iter [66350/160000]	lr: 3.512e-05, eta: 7:26:20, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1591, decode.acc_seg: 93.3990, aux.loss_ce: 0.0930, aux.acc_seg: 90.2689, loss: 0.2520, grad_norm: 2.2000
2023-02-19 09:20:17,650 - mmseg - INFO - Iter [66400/160000]	lr: 3.510e-05, eta: 7:26:05, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1593, decode.acc_seg: 93.5773, aux.loss_ce: 0.0917, aux.acc_seg: 90.6793, loss: 0.2510, grad_norm: 2.5983
2023-02-19 09:20:32,061 - mmseg - INFO - Iter [66450/160000]	lr: 3.508e-05, eta: 7:25:51, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1538, decode.acc_seg: 93.6131, aux.loss_ce: 0.0869, aux.acc_seg: 91.0949, loss: 0.2408, grad_norm: 1.8426
2023-02-19 09:20:45,772 - mmseg - INFO - Iter [66500/160000]	lr: 3.506e-05, eta: 7:25:36, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1471, decode.acc_seg: 93.7117, aux.loss_ce: 0.0847, aux.acc_seg: 91.0917, loss: 0.2318, grad_norm: 2.2192
2023-02-19 09:21:00,021 - mmseg - INFO - Iter [66550/160000]	lr: 3.504e-05, eta: 7:25:21, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1477, decode.acc_seg: 93.6277, aux.loss_ce: 0.0824, aux.acc_seg: 91.1842, loss: 0.2300, grad_norm: 2.3006
2023-02-19 09:21:13,872 - mmseg - INFO - Iter [66600/160000]	lr: 3.503e-05, eta: 7:25:06, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1575, decode.acc_seg: 93.4454, aux.loss_ce: 0.0878, aux.acc_seg: 90.8586, loss: 0.2453, grad_norm: 2.5009
2023-02-19 09:21:28,213 - mmseg - INFO - Iter [66650/160000]	lr: 3.501e-05, eta: 7:24:52, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1569, decode.acc_seg: 93.5458, aux.loss_ce: 0.0871, aux.acc_seg: 90.9757, loss: 0.2440, grad_norm: 2.8510
2023-02-19 09:21:42,264 - mmseg - INFO - Iter [66700/160000]	lr: 3.499e-05, eta: 7:24:37, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1612, decode.acc_seg: 93.3745, aux.loss_ce: 0.0909, aux.acc_seg: 90.7366, loss: 0.2521, grad_norm: 3.1155
2023-02-19 09:21:55,897 - mmseg - INFO - Iter [66750/160000]	lr: 3.497e-05, eta: 7:24:22, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1571, decode.acc_seg: 93.4829, aux.loss_ce: 0.0887, aux.acc_seg: 90.8783, loss: 0.2458, grad_norm: 2.1935
2023-02-19 09:22:09,725 - mmseg - INFO - Iter [66800/160000]	lr: 3.495e-05, eta: 7:24:07, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1618, decode.acc_seg: 93.5449, aux.loss_ce: 0.0893, aux.acc_seg: 90.9784, loss: 0.2511, grad_norm: 2.3858
2023-02-19 09:22:23,861 - mmseg - INFO - Iter [66850/160000]	lr: 3.493e-05, eta: 7:23:53, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1604, decode.acc_seg: 93.4599, aux.loss_ce: 0.0908, aux.acc_seg: 90.8852, loss: 0.2512, grad_norm: 2.0707
2023-02-19 09:22:37,872 - mmseg - INFO - Iter [66900/160000]	lr: 3.491e-05, eta: 7:23:38, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1507, decode.acc_seg: 93.7276, aux.loss_ce: 0.0867, aux.acc_seg: 90.9406, loss: 0.2374, grad_norm: 2.1836
2023-02-19 09:22:53,749 - mmseg - INFO - Iter [66950/160000]	lr: 3.489e-05, eta: 7:23:26, time: 0.318, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1535, decode.acc_seg: 93.6911, aux.loss_ce: 0.0878, aux.acc_seg: 91.0477, loss: 0.2413, grad_norm: 2.3021
2023-02-19 09:23:07,670 - mmseg - INFO - Saving checkpoint at 67000 iterations
2023-02-19 09:23:10,934 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:23:10,935 - mmseg - INFO - Iter [67000/160000]	lr: 3.488e-05, eta: 7:23:16, time: 0.344, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1544, decode.acc_seg: 93.6280, aux.loss_ce: 0.0871, aux.acc_seg: 90.9488, loss: 0.2414, grad_norm: 2.1752
2023-02-19 09:23:24,557 - mmseg - INFO - Iter [67050/160000]	lr: 3.486e-05, eta: 7:23:00, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1617, decode.acc_seg: 93.1864, aux.loss_ce: 0.0939, aux.acc_seg: 90.5110, loss: 0.2556, grad_norm: 2.7413
2023-02-19 09:23:38,352 - mmseg - INFO - Iter [67100/160000]	lr: 3.484e-05, eta: 7:22:45, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1542, decode.acc_seg: 93.5273, aux.loss_ce: 0.0855, aux.acc_seg: 91.1641, loss: 0.2397, grad_norm: 1.9179
2023-02-19 09:23:51,961 - mmseg - INFO - Iter [67150/160000]	lr: 3.482e-05, eta: 7:22:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1453, decode.acc_seg: 93.8378, aux.loss_ce: 0.0819, aux.acc_seg: 91.4023, loss: 0.2272, grad_norm: 2.0964
2023-02-19 09:24:05,660 - mmseg - INFO - Iter [67200/160000]	lr: 3.480e-05, eta: 7:22:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1456, decode.acc_seg: 93.9454, aux.loss_ce: 0.0828, aux.acc_seg: 91.3481, loss: 0.2284, grad_norm: 2.3123
2023-02-19 09:24:19,479 - mmseg - INFO - Iter [67250/160000]	lr: 3.478e-05, eta: 7:22:00, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1674, decode.acc_seg: 93.3261, aux.loss_ce: 0.0918, aux.acc_seg: 90.8030, loss: 0.2593, grad_norm: 2.8095
2023-02-19 09:24:33,475 - mmseg - INFO - Iter [67300/160000]	lr: 3.476e-05, eta: 7:21:45, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1572, decode.acc_seg: 93.5803, aux.loss_ce: 0.0917, aux.acc_seg: 90.7641, loss: 0.2489, grad_norm: 2.0700
2023-02-19 09:24:47,735 - mmseg - INFO - Iter [67350/160000]	lr: 3.474e-05, eta: 7:21:31, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1518, decode.acc_seg: 93.6624, aux.loss_ce: 0.0848, aux.acc_seg: 91.2709, loss: 0.2366, grad_norm: 1.9759
2023-02-19 09:25:01,296 - mmseg - INFO - Iter [67400/160000]	lr: 3.473e-05, eta: 7:21:16, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1588, decode.acc_seg: 93.5142, aux.loss_ce: 0.0870, aux.acc_seg: 91.1334, loss: 0.2457, grad_norm: 2.3867
2023-02-19 09:25:15,317 - mmseg - INFO - Iter [67450/160000]	lr: 3.471e-05, eta: 7:21:01, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1748, decode.acc_seg: 93.0142, aux.loss_ce: 0.0913, aux.acc_seg: 90.6445, loss: 0.2661, grad_norm: 2.7340
2023-02-19 09:25:28,903 - mmseg - INFO - Iter [67500/160000]	lr: 3.469e-05, eta: 7:20:46, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1434, decode.acc_seg: 93.9510, aux.loss_ce: 0.0805, aux.acc_seg: 91.5866, loss: 0.2239, grad_norm: 1.7835
2023-02-19 09:25:42,782 - mmseg - INFO - Iter [67550/160000]	lr: 3.467e-05, eta: 7:20:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1465, decode.acc_seg: 93.6342, aux.loss_ce: 0.0843, aux.acc_seg: 90.9411, loss: 0.2308, grad_norm: 1.9500
2023-02-19 09:25:56,528 - mmseg - INFO - Iter [67600/160000]	lr: 3.465e-05, eta: 7:20:16, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1531, decode.acc_seg: 93.7589, aux.loss_ce: 0.0842, aux.acc_seg: 91.4711, loss: 0.2373, grad_norm: 2.6332
2023-02-19 09:26:10,192 - mmseg - INFO - Iter [67650/160000]	lr: 3.463e-05, eta: 7:20:01, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1513, decode.acc_seg: 93.5595, aux.loss_ce: 0.0848, aux.acc_seg: 91.1937, loss: 0.2362, grad_norm: 2.2529
2023-02-19 09:26:24,562 - mmseg - INFO - Iter [67700/160000]	lr: 3.461e-05, eta: 7:19:47, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1487, decode.acc_seg: 93.6555, aux.loss_ce: 0.0836, aux.acc_seg: 91.2916, loss: 0.2323, grad_norm: 3.0350
2023-02-19 09:26:38,222 - mmseg - INFO - Iter [67750/160000]	lr: 3.459e-05, eta: 7:19:31, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1530, decode.acc_seg: 93.4209, aux.loss_ce: 0.0863, aux.acc_seg: 90.8773, loss: 0.2393, grad_norm: 2.2571
2023-02-19 09:26:52,512 - mmseg - INFO - Iter [67800/160000]	lr: 3.458e-05, eta: 7:19:17, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1636, decode.acc_seg: 93.3034, aux.loss_ce: 0.0919, aux.acc_seg: 90.7486, loss: 0.2555, grad_norm: 2.9980
2023-02-19 09:27:06,636 - mmseg - INFO - Iter [67850/160000]	lr: 3.456e-05, eta: 7:19:03, time: 0.283, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1456, decode.acc_seg: 93.7274, aux.loss_ce: 0.0840, aux.acc_seg: 91.1337, loss: 0.2295, grad_norm: 2.3651
2023-02-19 09:27:20,476 - mmseg - INFO - Iter [67900/160000]	lr: 3.454e-05, eta: 7:18:48, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1559, decode.acc_seg: 93.4879, aux.loss_ce: 0.0892, aux.acc_seg: 90.7813, loss: 0.2451, grad_norm: 2.8527
2023-02-19 09:27:34,932 - mmseg - INFO - Iter [67950/160000]	lr: 3.452e-05, eta: 7:18:34, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1534, decode.acc_seg: 93.6508, aux.loss_ce: 0.0878, aux.acc_seg: 90.8928, loss: 0.2412, grad_norm: 2.6480
2023-02-19 09:27:48,467 - mmseg - INFO - Saving checkpoint at 68000 iterations
2023-02-19 09:27:51,701 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:27:51,701 - mmseg - INFO - Iter [68000/160000]	lr: 3.450e-05, eta: 7:18:23, time: 0.336, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1481, decode.acc_seg: 93.8755, aux.loss_ce: 0.0851, aux.acc_seg: 91.3070, loss: 0.2331, grad_norm: 2.3783
2023-02-19 09:28:05,548 - mmseg - INFO - Iter [68050/160000]	lr: 3.448e-05, eta: 7:18:08, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1586, decode.acc_seg: 93.4766, aux.loss_ce: 0.0883, aux.acc_seg: 90.9572, loss: 0.2469, grad_norm: 2.3073
2023-02-19 09:28:19,227 - mmseg - INFO - Iter [68100/160000]	lr: 3.446e-05, eta: 7:17:53, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1562, decode.acc_seg: 93.4773, aux.loss_ce: 0.0883, aux.acc_seg: 90.9076, loss: 0.2445, grad_norm: 2.4508
2023-02-19 09:28:32,815 - mmseg - INFO - Iter [68150/160000]	lr: 3.444e-05, eta: 7:17:37, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1515, decode.acc_seg: 93.7669, aux.loss_ce: 0.0863, aux.acc_seg: 91.2990, loss: 0.2377, grad_norm: 2.0992
2023-02-19 09:28:46,437 - mmseg - INFO - Iter [68200/160000]	lr: 3.443e-05, eta: 7:17:22, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1523, decode.acc_seg: 93.6634, aux.loss_ce: 0.0880, aux.acc_seg: 90.8810, loss: 0.2402, grad_norm: 2.5979
2023-02-19 09:29:02,840 - mmseg - INFO - Iter [68250/160000]	lr: 3.441e-05, eta: 7:17:11, time: 0.328, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1558, decode.acc_seg: 93.4301, aux.loss_ce: 0.0854, aux.acc_seg: 91.2327, loss: 0.2412, grad_norm: 2.1979
2023-02-19 09:29:17,738 - mmseg - INFO - Iter [68300/160000]	lr: 3.439e-05, eta: 7:16:57, time: 0.298, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1412, decode.acc_seg: 94.1616, aux.loss_ce: 0.0805, aux.acc_seg: 91.8276, loss: 0.2218, grad_norm: 1.7664
2023-02-19 09:29:32,026 - mmseg - INFO - Iter [68350/160000]	lr: 3.437e-05, eta: 7:16:43, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1485, decode.acc_seg: 93.8833, aux.loss_ce: 0.0849, aux.acc_seg: 91.3176, loss: 0.2334, grad_norm: 1.8289
2023-02-19 09:29:46,124 - mmseg - INFO - Iter [68400/160000]	lr: 3.435e-05, eta: 7:16:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1531, decode.acc_seg: 93.6010, aux.loss_ce: 0.0857, aux.acc_seg: 91.1552, loss: 0.2388, grad_norm: 2.0671
2023-02-19 09:29:59,649 - mmseg - INFO - Iter [68450/160000]	lr: 3.433e-05, eta: 7:16:13, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1548, decode.acc_seg: 93.3928, aux.loss_ce: 0.0890, aux.acc_seg: 90.8346, loss: 0.2438, grad_norm: 1.9358
2023-02-19 09:30:13,914 - mmseg - INFO - Iter [68500/160000]	lr: 3.431e-05, eta: 7:15:59, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1558, decode.acc_seg: 93.6580, aux.loss_ce: 0.0882, aux.acc_seg: 91.1283, loss: 0.2439, grad_norm: 2.0429
2023-02-19 09:30:27,524 - mmseg - INFO - Iter [68550/160000]	lr: 3.429e-05, eta: 7:15:43, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1532, decode.acc_seg: 93.6163, aux.loss_ce: 0.0861, aux.acc_seg: 91.0389, loss: 0.2393, grad_norm: 1.8709
2023-02-19 09:30:41,524 - mmseg - INFO - Iter [68600/160000]	lr: 3.428e-05, eta: 7:15:29, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1497, decode.acc_seg: 93.8463, aux.loss_ce: 0.0872, aux.acc_seg: 91.1387, loss: 0.2370, grad_norm: 2.3359
2023-02-19 09:30:55,452 - mmseg - INFO - Iter [68650/160000]	lr: 3.426e-05, eta: 7:15:14, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1410, decode.acc_seg: 94.0750, aux.loss_ce: 0.0826, aux.acc_seg: 91.5277, loss: 0.2236, grad_norm: 1.9722
2023-02-19 09:31:09,032 - mmseg - INFO - Iter [68700/160000]	lr: 3.424e-05, eta: 7:14:59, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1562, decode.acc_seg: 93.4213, aux.loss_ce: 0.0864, aux.acc_seg: 91.1007, loss: 0.2426, grad_norm: 2.1608
2023-02-19 09:31:22,676 - mmseg - INFO - Iter [68750/160000]	lr: 3.422e-05, eta: 7:14:44, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1543, decode.acc_seg: 93.4962, aux.loss_ce: 0.0853, aux.acc_seg: 91.1712, loss: 0.2396, grad_norm: 2.4222
2023-02-19 09:31:36,395 - mmseg - INFO - Iter [68800/160000]	lr: 3.420e-05, eta: 7:14:29, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1542, decode.acc_seg: 93.5849, aux.loss_ce: 0.0860, aux.acc_seg: 91.1813, loss: 0.2402, grad_norm: 2.0086
2023-02-19 09:31:50,278 - mmseg - INFO - Iter [68850/160000]	lr: 3.418e-05, eta: 7:14:14, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1512, decode.acc_seg: 93.7817, aux.loss_ce: 0.0854, aux.acc_seg: 91.3078, loss: 0.2366, grad_norm: 1.9791
2023-02-19 09:32:04,230 - mmseg - INFO - Iter [68900/160000]	lr: 3.416e-05, eta: 7:13:59, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1530, decode.acc_seg: 93.5733, aux.loss_ce: 0.0885, aux.acc_seg: 90.9305, loss: 0.2415, grad_norm: 2.4746
2023-02-19 09:32:17,913 - mmseg - INFO - Iter [68950/160000]	lr: 3.414e-05, eta: 7:13:44, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1503, decode.acc_seg: 93.7412, aux.loss_ce: 0.0845, aux.acc_seg: 91.2262, loss: 0.2349, grad_norm: 1.8924
2023-02-19 09:32:31,494 - mmseg - INFO - Saving checkpoint at 69000 iterations
2023-02-19 09:32:34,760 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:32:34,760 - mmseg - INFO - Iter [69000/160000]	lr: 3.413e-05, eta: 7:13:33, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1439, decode.acc_seg: 93.9699, aux.loss_ce: 0.0833, aux.acc_seg: 91.4090, loss: 0.2273, grad_norm: 2.2551
2023-02-19 09:32:48,442 - mmseg - INFO - Iter [69050/160000]	lr: 3.411e-05, eta: 7:13:18, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1473, decode.acc_seg: 93.8172, aux.loss_ce: 0.0832, aux.acc_seg: 91.3213, loss: 0.2305, grad_norm: 1.9562
2023-02-19 09:33:02,106 - mmseg - INFO - Iter [69100/160000]	lr: 3.409e-05, eta: 7:13:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1422, decode.acc_seg: 94.1032, aux.loss_ce: 0.0811, aux.acc_seg: 91.7486, loss: 0.2233, grad_norm: 1.7567
2023-02-19 09:33:16,188 - mmseg - INFO - Iter [69150/160000]	lr: 3.407e-05, eta: 7:12:48, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1529, decode.acc_seg: 93.7808, aux.loss_ce: 0.0869, aux.acc_seg: 91.1263, loss: 0.2398, grad_norm: 2.4596
2023-02-19 09:33:31,058 - mmseg - INFO - Iter [69200/160000]	lr: 3.405e-05, eta: 7:12:35, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1594, decode.acc_seg: 93.3237, aux.loss_ce: 0.0903, aux.acc_seg: 90.6917, loss: 0.2497, grad_norm: 2.7397
2023-02-19 09:33:45,301 - mmseg - INFO - Iter [69250/160000]	lr: 3.403e-05, eta: 7:12:20, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1435, decode.acc_seg: 93.8747, aux.loss_ce: 0.0811, aux.acc_seg: 91.2999, loss: 0.2246, grad_norm: 1.8170
2023-02-19 09:33:58,995 - mmseg - INFO - Iter [69300/160000]	lr: 3.401e-05, eta: 7:12:05, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1494, decode.acc_seg: 93.7869, aux.loss_ce: 0.0824, aux.acc_seg: 91.5452, loss: 0.2318, grad_norm: 1.9005
2023-02-19 09:34:12,933 - mmseg - INFO - Iter [69350/160000]	lr: 3.399e-05, eta: 7:11:50, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1564, decode.acc_seg: 93.4395, aux.loss_ce: 0.0875, aux.acc_seg: 90.8935, loss: 0.2440, grad_norm: 2.9108
2023-02-19 09:34:26,634 - mmseg - INFO - Iter [69400/160000]	lr: 3.398e-05, eta: 7:11:35, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1484, decode.acc_seg: 93.7260, aux.loss_ce: 0.0838, aux.acc_seg: 91.3077, loss: 0.2321, grad_norm: 1.9829
2023-02-19 09:34:40,254 - mmseg - INFO - Iter [69450/160000]	lr: 3.396e-05, eta: 7:11:20, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1554, decode.acc_seg: 93.4738, aux.loss_ce: 0.0889, aux.acc_seg: 90.7967, loss: 0.2444, grad_norm: 2.5040
2023-02-19 09:34:56,331 - mmseg - INFO - Iter [69500/160000]	lr: 3.394e-05, eta: 7:11:08, time: 0.321, data_time: 0.046, memory: 15214, decode.loss_ce: 0.1454, decode.acc_seg: 93.8928, aux.loss_ce: 0.0854, aux.acc_seg: 91.0949, loss: 0.2309, grad_norm: 2.1907
2023-02-19 09:35:10,600 - mmseg - INFO - Iter [69550/160000]	lr: 3.392e-05, eta: 7:10:54, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1493, decode.acc_seg: 93.5996, aux.loss_ce: 0.0826, aux.acc_seg: 91.1435, loss: 0.2319, grad_norm: 1.9014
2023-02-19 09:35:24,935 - mmseg - INFO - Iter [69600/160000]	lr: 3.390e-05, eta: 7:10:40, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1529, decode.acc_seg: 93.6195, aux.loss_ce: 0.0866, aux.acc_seg: 91.1529, loss: 0.2395, grad_norm: 1.8625
2023-02-19 09:35:38,654 - mmseg - INFO - Iter [69650/160000]	lr: 3.388e-05, eta: 7:10:25, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1415, decode.acc_seg: 93.9595, aux.loss_ce: 0.0839, aux.acc_seg: 91.3039, loss: 0.2254, grad_norm: 2.0791
2023-02-19 09:35:52,375 - mmseg - INFO - Iter [69700/160000]	lr: 3.386e-05, eta: 7:10:10, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1509, decode.acc_seg: 93.8125, aux.loss_ce: 0.0868, aux.acc_seg: 91.1357, loss: 0.2377, grad_norm: 2.1149
2023-02-19 09:36:05,979 - mmseg - INFO - Iter [69750/160000]	lr: 3.384e-05, eta: 7:09:54, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1518, decode.acc_seg: 93.8346, aux.loss_ce: 0.0857, aux.acc_seg: 91.3598, loss: 0.2375, grad_norm: 2.1707
2023-02-19 09:36:19,669 - mmseg - INFO - Iter [69800/160000]	lr: 3.383e-05, eta: 7:09:39, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1520, decode.acc_seg: 93.6905, aux.loss_ce: 0.0861, aux.acc_seg: 91.2135, loss: 0.2381, grad_norm: 2.0610
2023-02-19 09:36:33,258 - mmseg - INFO - Iter [69850/160000]	lr: 3.381e-05, eta: 7:09:24, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1463, decode.acc_seg: 93.9496, aux.loss_ce: 0.0844, aux.acc_seg: 91.3634, loss: 0.2307, grad_norm: 2.1902
2023-02-19 09:36:46,819 - mmseg - INFO - Iter [69900/160000]	lr: 3.379e-05, eta: 7:09:09, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1569, decode.acc_seg: 93.5750, aux.loss_ce: 0.0870, aux.acc_seg: 91.4021, loss: 0.2439, grad_norm: 2.2036
2023-02-19 09:37:00,553 - mmseg - INFO - Iter [69950/160000]	lr: 3.377e-05, eta: 7:08:54, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1461, decode.acc_seg: 93.8691, aux.loss_ce: 0.0809, aux.acc_seg: 91.6155, loss: 0.2270, grad_norm: 2.0007
2023-02-19 09:37:14,318 - mmseg - INFO - Saving checkpoint at 70000 iterations
2023-02-19 09:37:17,554 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:37:17,555 - mmseg - INFO - Iter [70000/160000]	lr: 3.375e-05, eta: 7:08:43, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1420, decode.acc_seg: 93.9004, aux.loss_ce: 0.0812, aux.acc_seg: 91.2786, loss: 0.2232, grad_norm: 2.0143
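The lr column is consistent with a poly (power 1.0, i.e. linear) decay from a base learning rate of 6e-5 down to 0 over the 160k iterations. The constants below are inferred from the logged numbers, not read from the config excerpt, so treat them as assumptions:

def poly_lr(iteration, base_lr=6e-5, max_iters=160_000, power=1.0, min_lr=0.0):
    # Standard poly schedule; constants inferred from the log, not the config.
    return (base_lr - min_lr) * (1 - iteration / max_iters) ** power + min_lr

print(f"{poly_lr(64050):.3e}")  # 3.598e-05, as logged at iter 64050
print(f"{poly_lr(70000):.3e}")  # 3.375e-05, as logged at iter 70000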
2023-02-19 09:37:31,350 - mmseg - INFO - Iter [70050/160000]	lr: 3.373e-05, eta: 7:08:28, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1578, decode.acc_seg: 93.4886, aux.loss_ce: 0.0901, aux.acc_seg: 90.9171, loss: 0.2480, grad_norm: 2.4019
2023-02-19 09:37:45,047 - mmseg - INFO - Iter [70100/160000]	lr: 3.371e-05, eta: 7:08:13, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1498, decode.acc_seg: 93.6903, aux.loss_ce: 0.0856, aux.acc_seg: 91.1236, loss: 0.2353, grad_norm: 1.8269
2023-02-19 09:37:59,119 - mmseg - INFO - Iter [70150/160000]	lr: 3.369e-05, eta: 7:07:59, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1498, decode.acc_seg: 93.6603, aux.loss_ce: 0.0815, aux.acc_seg: 91.3853, loss: 0.2313, grad_norm: 2.3383
2023-02-19 09:38:12,919 - mmseg - INFO - Iter [70200/160000]	lr: 3.368e-05, eta: 7:07:44, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1546, decode.acc_seg: 93.6977, aux.loss_ce: 0.0874, aux.acc_seg: 91.2465, loss: 0.2420, grad_norm: 3.8998
2023-02-19 09:38:27,254 - mmseg - INFO - Iter [70250/160000]	lr: 3.366e-05, eta: 7:07:29, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1593, decode.acc_seg: 93.2938, aux.loss_ce: 0.0906, aux.acc_seg: 90.5718, loss: 0.2499, grad_norm: 3.9849
2023-02-19 09:38:41,018 - mmseg - INFO - Iter [70300/160000]	lr: 3.364e-05, eta: 7:07:15, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1593, decode.acc_seg: 93.5037, aux.loss_ce: 0.0901, aux.acc_seg: 90.7626, loss: 0.2493, grad_norm: 2.3655
2023-02-19 09:38:54,790 - mmseg - INFO - Iter [70350/160000]	lr: 3.362e-05, eta: 7:07:00, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1557, decode.acc_seg: 93.5463, aux.loss_ce: 0.0917, aux.acc_seg: 90.5039, loss: 0.2474, grad_norm: 2.2412
2023-02-19 09:39:09,228 - mmseg - INFO - Iter [70400/160000]	lr: 3.360e-05, eta: 7:06:45, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1587, decode.acc_seg: 93.3744, aux.loss_ce: 0.0915, aux.acc_seg: 90.7355, loss: 0.2502, grad_norm: 2.2878
2023-02-19 09:39:23,130 - mmseg - INFO - Iter [70450/160000]	lr: 3.358e-05, eta: 7:06:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1518, decode.acc_seg: 93.6611, aux.loss_ce: 0.0864, aux.acc_seg: 91.1219, loss: 0.2382, grad_norm: 2.1975
2023-02-19 09:39:37,097 - mmseg - INFO - Iter [70500/160000]	lr: 3.356e-05, eta: 7:06:16, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1454, decode.acc_seg: 93.9881, aux.loss_ce: 0.0835, aux.acc_seg: 91.4014, loss: 0.2289, grad_norm: 2.1592
2023-02-19 09:39:51,501 - mmseg - INFO - Iter [70550/160000]	lr: 3.354e-05, eta: 7:06:02, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1487, decode.acc_seg: 93.6425, aux.loss_ce: 0.0841, aux.acc_seg: 91.0760, loss: 0.2327, grad_norm: 3.0253
2023-02-19 09:40:05,921 - mmseg - INFO - Iter [70600/160000]	lr: 3.353e-05, eta: 7:05:48, time: 0.289, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1484, decode.acc_seg: 93.8163, aux.loss_ce: 0.0859, aux.acc_seg: 91.2406, loss: 0.2343, grad_norm: 2.4462
2023-02-19 09:40:20,340 - mmseg - INFO - Iter [70650/160000]	lr: 3.351e-05, eta: 7:05:34, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1491, decode.acc_seg: 93.8483, aux.loss_ce: 0.0851, aux.acc_seg: 91.1538, loss: 0.2342, grad_norm: 2.1796
2023-02-19 09:40:34,253 - mmseg - INFO - Iter [70700/160000]	lr: 3.349e-05, eta: 7:05:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1516, decode.acc_seg: 93.7392, aux.loss_ce: 0.0852, aux.acc_seg: 91.2450, loss: 0.2369, grad_norm: 2.0938
2023-02-19 09:40:50,801 - mmseg - INFO - Iter [70750/160000]	lr: 3.347e-05, eta: 7:05:07, time: 0.331, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1553, decode.acc_seg: 93.7074, aux.loss_ce: 0.0874, aux.acc_seg: 91.1989, loss: 0.2427, grad_norm: 2.3312
2023-02-19 09:41:05,097 - mmseg - INFO - Iter [70800/160000]	lr: 3.345e-05, eta: 7:04:53, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1462, decode.acc_seg: 93.8136, aux.loss_ce: 0.0858, aux.acc_seg: 91.3920, loss: 0.2320, grad_norm: 2.0197
2023-02-19 09:41:18,795 - mmseg - INFO - Iter [70850/160000]	lr: 3.343e-05, eta: 7:04:38, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1419, decode.acc_seg: 94.0030, aux.loss_ce: 0.0815, aux.acc_seg: 91.3326, loss: 0.2235, grad_norm: 1.8848
2023-02-19 09:41:32,504 - mmseg - INFO - Iter [70900/160000]	lr: 3.341e-05, eta: 7:04:23, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1497, decode.acc_seg: 93.6959, aux.loss_ce: 0.0841, aux.acc_seg: 91.2250, loss: 0.2338, grad_norm: 2.1127
2023-02-19 09:41:47,012 - mmseg - INFO - Iter [70950/160000]	lr: 3.339e-05, eta: 7:04:09, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1509, decode.acc_seg: 93.7189, aux.loss_ce: 0.0841, aux.acc_seg: 91.4636, loss: 0.2350, grad_norm: 1.9177
2023-02-19 09:42:01,242 - mmseg - INFO - Saving checkpoint at 71000 iterations
2023-02-19 09:42:04,477 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:42:04,478 - mmseg - INFO - Iter [71000/160000]	lr: 3.338e-05, eta: 7:03:59, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1483, decode.acc_seg: 93.7653, aux.loss_ce: 0.0834, aux.acc_seg: 91.2862, loss: 0.2317, grad_norm: 2.0558
2023-02-19 09:42:18,511 - mmseg - INFO - Iter [71050/160000]	lr: 3.336e-05, eta: 7:03:44, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1542, decode.acc_seg: 93.6786, aux.loss_ce: 0.0858, aux.acc_seg: 91.1370, loss: 0.2400, grad_norm: 2.1741
2023-02-19 09:42:32,062 - mmseg - INFO - Iter [71100/160000]	lr: 3.334e-05, eta: 7:03:29, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1487, decode.acc_seg: 93.8284, aux.loss_ce: 0.0866, aux.acc_seg: 91.1600, loss: 0.2354, grad_norm: 1.9088
2023-02-19 09:42:45,655 - mmseg - INFO - Iter [71150/160000]	lr: 3.332e-05, eta: 7:03:14, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1422, decode.acc_seg: 94.0880, aux.loss_ce: 0.0828, aux.acc_seg: 91.3581, loss: 0.2250, grad_norm: 1.7951
2023-02-19 09:42:59,301 - mmseg - INFO - Iter [71200/160000]	lr: 3.330e-05, eta: 7:02:59, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1422, decode.acc_seg: 94.1564, aux.loss_ce: 0.0802, aux.acc_seg: 91.7704, loss: 0.2224, grad_norm: 2.1025
2023-02-19 09:43:13,359 - mmseg - INFO - Iter [71250/160000]	lr: 3.328e-05, eta: 7:02:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1501, decode.acc_seg: 93.7198, aux.loss_ce: 0.0841, aux.acc_seg: 91.2736, loss: 0.2342, grad_norm: 1.9904
2023-02-19 09:43:27,022 - mmseg - INFO - Iter [71300/160000]	lr: 3.326e-05, eta: 7:02:29, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1524, decode.acc_seg: 93.6800, aux.loss_ce: 0.0851, aux.acc_seg: 91.3986, loss: 0.2375, grad_norm: 2.4449
2023-02-19 09:43:40,748 - mmseg - INFO - Iter [71350/160000]	lr: 3.324e-05, eta: 7:02:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1506, decode.acc_seg: 93.7883, aux.loss_ce: 0.0853, aux.acc_seg: 91.2200, loss: 0.2359, grad_norm: 2.5384
2023-02-19 09:43:55,084 - mmseg - INFO - Iter [71400/160000]	lr: 3.323e-05, eta: 7:02:00, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1524, decode.acc_seg: 93.6186, aux.loss_ce: 0.0857, aux.acc_seg: 91.1982, loss: 0.2380, grad_norm: 2.4279
2023-02-19 09:44:09,452 - mmseg - INFO - Iter [71450/160000]	lr: 3.321e-05, eta: 7:01:46, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1584, decode.acc_seg: 93.3573, aux.loss_ce: 0.0912, aux.acc_seg: 90.4166, loss: 0.2496, grad_norm: 2.1508
2023-02-19 09:44:23,238 - mmseg - INFO - Iter [71500/160000]	lr: 3.319e-05, eta: 7:01:31, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1412, decode.acc_seg: 94.2858, aux.loss_ce: 0.0792, aux.acc_seg: 91.9427, loss: 0.2203, grad_norm: 2.3841
2023-02-19 09:44:36,826 - mmseg - INFO - Iter [71550/160000]	lr: 3.317e-05, eta: 7:01:16, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1508, decode.acc_seg: 93.8267, aux.loss_ce: 0.0870, aux.acc_seg: 91.2614, loss: 0.2378, grad_norm: 2.1505
2023-02-19 09:44:50,910 - mmseg - INFO - Iter [71600/160000]	lr: 3.315e-05, eta: 7:01:01, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1422, decode.acc_seg: 93.9673, aux.loss_ce: 0.0828, aux.acc_seg: 91.2441, loss: 0.2250, grad_norm: 2.6444
2023-02-19 09:45:04,791 - mmseg - INFO - Iter [71650/160000]	lr: 3.313e-05, eta: 7:00:46, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1431, decode.acc_seg: 94.0722, aux.loss_ce: 0.0800, aux.acc_seg: 91.8274, loss: 0.2231, grad_norm: 2.4481
2023-02-19 09:45:18,584 - mmseg - INFO - Iter [71700/160000]	lr: 3.311e-05, eta: 7:00:31, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1441, decode.acc_seg: 94.0506, aux.loss_ce: 0.0826, aux.acc_seg: 91.5399, loss: 0.2267, grad_norm: 2.3339
2023-02-19 09:45:33,214 - mmseg - INFO - Iter [71750/160000]	lr: 3.309e-05, eta: 7:00:17, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1413, decode.acc_seg: 93.9566, aux.loss_ce: 0.0830, aux.acc_seg: 91.2976, loss: 0.2243, grad_norm: 2.1981
2023-02-19 09:45:47,023 - mmseg - INFO - Iter [71800/160000]	lr: 3.308e-05, eta: 7:00:03, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1507, decode.acc_seg: 93.7303, aux.loss_ce: 0.0868, aux.acc_seg: 91.1735, loss: 0.2374, grad_norm: 2.1862
2023-02-19 09:46:01,641 - mmseg - INFO - Iter [71850/160000]	lr: 3.306e-05, eta: 6:59:49, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1462, decode.acc_seg: 93.9463, aux.loss_ce: 0.0827, aux.acc_seg: 91.4216, loss: 0.2289, grad_norm: 2.7654
2023-02-19 09:46:15,686 - mmseg - INFO - Iter [71900/160000]	lr: 3.304e-05, eta: 6:59:34, time: 0.282, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1512, decode.acc_seg: 93.8137, aux.loss_ce: 0.0868, aux.acc_seg: 91.2514, loss: 0.2380, grad_norm: 2.1159
2023-02-19 09:46:29,304 - mmseg - INFO - Iter [71950/160000]	lr: 3.302e-05, eta: 6:59:19, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1483, decode.acc_seg: 93.6861, aux.loss_ce: 0.0829, aux.acc_seg: 91.2760, loss: 0.2312, grad_norm: 1.7955
2023-02-19 09:46:45,159 - mmseg - INFO - Saving checkpoint at 72000 iterations
2023-02-19 09:46:48,460 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:46:48,460 - mmseg - INFO - Iter [72000/160000]	lr: 3.300e-05, eta: 6:59:11, time: 0.383, data_time: 0.046, memory: 15214, decode.loss_ce: 0.1556, decode.acc_seg: 93.4197, aux.loss_ce: 0.0867, aux.acc_seg: 91.0009, loss: 0.2423, grad_norm: 2.4765
2023-02-19 09:47:02,626 - mmseg - INFO - Iter [72050/160000]	lr: 3.298e-05, eta: 6:58:56, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1500, decode.acc_seg: 93.7051, aux.loss_ce: 0.0866, aux.acc_seg: 91.0339, loss: 0.2366, grad_norm: 2.1629
2023-02-19 09:47:16,363 - mmseg - INFO - Iter [72100/160000]	lr: 3.296e-05, eta: 6:58:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1419, decode.acc_seg: 94.0045, aux.loss_ce: 0.0841, aux.acc_seg: 91.2013, loss: 0.2260, grad_norm: 2.0137
2023-02-19 09:47:31,994 - mmseg - INFO - Iter [72150/160000]	lr: 3.294e-05, eta: 6:58:29, time: 0.313, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1494, decode.acc_seg: 93.6453, aux.loss_ce: 0.0840, aux.acc_seg: 91.2515, loss: 0.2334, grad_norm: 1.8275
2023-02-19 09:47:46,387 - mmseg - INFO - Iter [72200/160000]	lr: 3.293e-05, eta: 6:58:14, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1375, decode.acc_seg: 94.1809, aux.loss_ce: 0.0786, aux.acc_seg: 91.8198, loss: 0.2161, grad_norm: 1.5668
2023-02-19 09:48:00,114 - mmseg - INFO - Iter [72250/160000]	lr: 3.291e-05, eta: 6:57:59, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1467, decode.acc_seg: 93.8648, aux.loss_ce: 0.0841, aux.acc_seg: 91.3659, loss: 0.2308, grad_norm: 3.3819
2023-02-19 09:48:14,194 - mmseg - INFO - Iter [72300/160000]	lr: 3.289e-05, eta: 6:57:45, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1470, decode.acc_seg: 93.9751, aux.loss_ce: 0.0824, aux.acc_seg: 91.6483, loss: 0.2294, grad_norm: 2.3307
2023-02-19 09:48:27,974 - mmseg - INFO - Iter [72350/160000]	lr: 3.287e-05, eta: 6:57:30, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1554, decode.acc_seg: 93.7046, aux.loss_ce: 0.0873, aux.acc_seg: 91.1448, loss: 0.2427, grad_norm: 2.7182
2023-02-19 09:48:42,115 - mmseg - INFO - Iter [72400/160000]	lr: 3.285e-05, eta: 6:57:15, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1479, decode.acc_seg: 93.6919, aux.loss_ce: 0.0858, aux.acc_seg: 91.0891, loss: 0.2337, grad_norm: 2.5059
2023-02-19 09:48:56,764 - mmseg - INFO - Iter [72450/160000]	lr: 3.283e-05, eta: 6:57:02, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1467, decode.acc_seg: 93.8621, aux.loss_ce: 0.0822, aux.acc_seg: 91.4328, loss: 0.2289, grad_norm: 2.0990
2023-02-19 09:49:10,399 - mmseg - INFO - Iter [72500/160000]	lr: 3.281e-05, eta: 6:56:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1503, decode.acc_seg: 93.7384, aux.loss_ce: 0.0826, aux.acc_seg: 91.5194, loss: 0.2329, grad_norm: 2.1427
2023-02-19 09:49:24,704 - mmseg - INFO - Iter [72550/160000]	lr: 3.279e-05, eta: 6:56:32, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1417, decode.acc_seg: 94.2193, aux.loss_ce: 0.0826, aux.acc_seg: 91.6240, loss: 0.2243, grad_norm: 1.9358
2023-02-19 09:49:38,886 - mmseg - INFO - Iter [72600/160000]	lr: 3.278e-05, eta: 6:56:18, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1490, decode.acc_seg: 93.7538, aux.loss_ce: 0.0847, aux.acc_seg: 91.1179, loss: 0.2337, grad_norm: 1.8898
2023-02-19 09:49:52,513 - mmseg - INFO - Iter [72650/160000]	lr: 3.276e-05, eta: 6:56:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1425, decode.acc_seg: 94.0370, aux.loss_ce: 0.0806, aux.acc_seg: 91.6767, loss: 0.2231, grad_norm: 2.1417
2023-02-19 09:50:06,239 - mmseg - INFO - Iter [72700/160000]	lr: 3.274e-05, eta: 6:55:48, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1473, decode.acc_seg: 93.8737, aux.loss_ce: 0.0842, aux.acc_seg: 91.3710, loss: 0.2315, grad_norm: 1.9222
2023-02-19 09:50:20,096 - mmseg - INFO - Iter [72750/160000]	lr: 3.272e-05, eta: 6:55:33, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1538, decode.acc_seg: 93.6462, aux.loss_ce: 0.0866, aux.acc_seg: 91.2280, loss: 0.2404, grad_norm: 2.5673
2023-02-19 09:50:33,874 - mmseg - INFO - Iter [72800/160000]	lr: 3.270e-05, eta: 6:55:18, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1458, decode.acc_seg: 93.8210, aux.loss_ce: 0.0874, aux.acc_seg: 91.0342, loss: 0.2333, grad_norm: 2.3079
2023-02-19 09:50:47,555 - mmseg - INFO - Iter [72850/160000]	lr: 3.268e-05, eta: 6:55:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1455, decode.acc_seg: 93.8664, aux.loss_ce: 0.0821, aux.acc_seg: 91.4717, loss: 0.2276, grad_norm: 2.2432
2023-02-19 09:51:01,130 - mmseg - INFO - Iter [72900/160000]	lr: 3.266e-05, eta: 6:54:48, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1554, decode.acc_seg: 93.5889, aux.loss_ce: 0.0909, aux.acc_seg: 90.8004, loss: 0.2463, grad_norm: 2.3881
2023-02-19 09:51:15,227 - mmseg - INFO - Iter [72950/160000]	lr: 3.264e-05, eta: 6:54:33, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1447, decode.acc_seg: 94.0868, aux.loss_ce: 0.0824, aux.acc_seg: 91.4733, loss: 0.2272, grad_norm: 2.0556
2023-02-19 09:51:29,200 - mmseg - INFO - Saving checkpoint at 73000 iterations
2023-02-19 09:51:32,442 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:51:32,442 - mmseg - INFO - Iter [73000/160000]	lr: 3.263e-05, eta: 6:54:23, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1468, decode.acc_seg: 94.0321, aux.loss_ce: 0.0834, aux.acc_seg: 91.5941, loss: 0.2302, grad_norm: 2.1114
2023-02-19 09:51:46,342 - mmseg - INFO - Iter [73050/160000]	lr: 3.261e-05, eta: 6:54:08, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1507, decode.acc_seg: 93.6081, aux.loss_ce: 0.0851, aux.acc_seg: 91.1290, loss: 0.2358, grad_norm: 2.4060
2023-02-19 09:51:59,972 - mmseg - INFO - Iter [73100/160000]	lr: 3.259e-05, eta: 6:53:53, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1462, decode.acc_seg: 93.9841, aux.loss_ce: 0.0846, aux.acc_seg: 91.4506, loss: 0.2307, grad_norm: 2.1201
2023-02-19 09:52:13,915 - mmseg - INFO - Iter [73150/160000]	lr: 3.257e-05, eta: 6:53:38, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1404, decode.acc_seg: 94.0059, aux.loss_ce: 0.0815, aux.acc_seg: 91.4758, loss: 0.2219, grad_norm: 1.9080
2023-02-19 09:52:27,918 - mmseg - INFO - Iter [73200/160000]	lr: 3.255e-05, eta: 6:53:23, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1455, decode.acc_seg: 93.9939, aux.loss_ce: 0.0821, aux.acc_seg: 91.6392, loss: 0.2276, grad_norm: 2.0774
2023-02-19 09:52:42,404 - mmseg - INFO - Iter [73250/160000]	lr: 3.253e-05, eta: 6:53:09, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1511, decode.acc_seg: 93.7540, aux.loss_ce: 0.0886, aux.acc_seg: 91.0241, loss: 0.2397, grad_norm: 2.0227
2023-02-19 09:52:58,210 - mmseg - INFO - Iter [73300/160000]	lr: 3.251e-05, eta: 6:52:57, time: 0.317, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1315, decode.acc_seg: 94.3584, aux.loss_ce: 0.0802, aux.acc_seg: 91.5643, loss: 0.2117, grad_norm: 1.8673
2023-02-19 09:53:13,015 - mmseg - INFO - Iter [73350/160000]	lr: 3.249e-05, eta: 6:52:43, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1455, decode.acc_seg: 93.8985, aux.loss_ce: 0.0820, aux.acc_seg: 91.6570, loss: 0.2275, grad_norm: 2.2998
2023-02-19 09:53:26,607 - mmseg - INFO - Iter [73400/160000]	lr: 3.248e-05, eta: 6:52:28, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1448, decode.acc_seg: 93.9682, aux.loss_ce: 0.0810, aux.acc_seg: 91.5690, loss: 0.2258, grad_norm: 2.7432
2023-02-19 09:53:40,268 - mmseg - INFO - Iter [73450/160000]	lr: 3.246e-05, eta: 6:52:13, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1492, decode.acc_seg: 93.9241, aux.loss_ce: 0.0885, aux.acc_seg: 91.2613, loss: 0.2377, grad_norm: 2.4802
2023-02-19 09:53:53,963 - mmseg - INFO - Iter [73500/160000]	lr: 3.244e-05, eta: 6:51:58, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1517, decode.acc_seg: 93.6421, aux.loss_ce: 0.0838, aux.acc_seg: 91.3058, loss: 0.2355, grad_norm: 1.9861
2023-02-19 09:54:07,958 - mmseg - INFO - Iter [73550/160000]	lr: 3.242e-05, eta: 6:51:43, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1374, decode.acc_seg: 94.0508, aux.loss_ce: 0.0784, aux.acc_seg: 91.6403, loss: 0.2157, grad_norm: 2.0075
2023-02-19 09:54:22,020 - mmseg - INFO - Iter [73600/160000]	lr: 3.240e-05, eta: 6:51:29, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1467, decode.acc_seg: 93.8761, aux.loss_ce: 0.0814, aux.acc_seg: 91.4967, loss: 0.2280, grad_norm: 1.8135
2023-02-19 09:54:36,419 - mmseg - INFO - Iter [73650/160000]	lr: 3.238e-05, eta: 6:51:15, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1457, decode.acc_seg: 93.8779, aux.loss_ce: 0.0850, aux.acc_seg: 91.1346, loss: 0.2307, grad_norm: 2.6897
2023-02-19 09:54:50,342 - mmseg - INFO - Iter [73700/160000]	lr: 3.236e-05, eta: 6:51:00, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1409, decode.acc_seg: 94.0569, aux.loss_ce: 0.0817, aux.acc_seg: 91.5522, loss: 0.2226, grad_norm: 1.8122
2023-02-19 09:55:04,168 - mmseg - INFO - Iter [73750/160000]	lr: 3.234e-05, eta: 6:50:45, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1459, decode.acc_seg: 93.8586, aux.loss_ce: 0.0834, aux.acc_seg: 91.3654, loss: 0.2293, grad_norm: 2.2372
2023-02-19 09:55:17,891 - mmseg - INFO - Iter [73800/160000]	lr: 3.233e-05, eta: 6:50:30, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1448, decode.acc_seg: 93.8326, aux.loss_ce: 0.0839, aux.acc_seg: 91.1787, loss: 0.2286, grad_norm: 2.5584
2023-02-19 09:55:31,739 - mmseg - INFO - Iter [73850/160000]	lr: 3.231e-05, eta: 6:50:15, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1471, decode.acc_seg: 93.8779, aux.loss_ce: 0.0837, aux.acc_seg: 91.4065, loss: 0.2309, grad_norm: 2.3758
2023-02-19 09:55:45,665 - mmseg - INFO - Iter [73900/160000]	lr: 3.229e-05, eta: 6:50:01, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.1040, aux.loss_ce: 0.0797, aux.acc_seg: 91.7900, loss: 0.2189, grad_norm: 2.1719
2023-02-19 09:56:00,262 - mmseg - INFO - Iter [73950/160000]	lr: 3.227e-05, eta: 6:49:47, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1431, decode.acc_seg: 94.0218, aux.loss_ce: 0.0825, aux.acc_seg: 91.5189, loss: 0.2256, grad_norm: 1.8624
2023-02-19 09:56:13,954 - mmseg - INFO - Saving checkpoint at 74000 iterations
2023-02-19 09:56:17,273 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 09:56:17,273 - mmseg - INFO - Iter [74000/160000]	lr: 3.225e-05, eta: 6:49:36, time: 0.340, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1482, decode.acc_seg: 93.8094, aux.loss_ce: 0.0848, aux.acc_seg: 91.3003, loss: 0.2330, grad_norm: 2.7742
2023-02-19 09:56:30,877 - mmseg - INFO - Iter [74050/160000]	lr: 3.223e-05, eta: 6:49:21, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1379, decode.acc_seg: 94.2861, aux.loss_ce: 0.0815, aux.acc_seg: 91.5959, loss: 0.2194, grad_norm: 2.0323
2023-02-19 09:56:44,485 - mmseg - INFO - Iter [74100/160000]	lr: 3.221e-05, eta: 6:49:06, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1345, decode.acc_seg: 94.3732, aux.loss_ce: 0.0787, aux.acc_seg: 91.7359, loss: 0.2132, grad_norm: 1.7805
2023-02-19 09:56:58,145 - mmseg - INFO - Iter [74150/160000]	lr: 3.219e-05, eta: 6:48:51, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1450, decode.acc_seg: 93.8569, aux.loss_ce: 0.0832, aux.acc_seg: 91.2492, loss: 0.2281, grad_norm: 2.1016
2023-02-19 09:57:11,868 - mmseg - INFO - Iter [74200/160000]	lr: 3.218e-05, eta: 6:48:36, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1464, decode.acc_seg: 93.7887, aux.loss_ce: 0.0834, aux.acc_seg: 91.2981, loss: 0.2298, grad_norm: 2.2075
2023-02-19 09:57:25,863 - mmseg - INFO - Iter [74250/160000]	lr: 3.216e-05, eta: 6:48:21, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1396, decode.acc_seg: 94.1029, aux.loss_ce: 0.0790, aux.acc_seg: 91.6842, loss: 0.2186, grad_norm: 1.7968
2023-02-19 09:57:39,755 - mmseg - INFO - Iter [74300/160000]	lr: 3.214e-05, eta: 6:48:06, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1449, decode.acc_seg: 93.7580, aux.loss_ce: 0.0852, aux.acc_seg: 91.0202, loss: 0.2301, grad_norm: 2.3695
2023-02-19 09:57:53,649 - mmseg - INFO - Iter [74350/160000]	lr: 3.212e-05, eta: 6:47:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1433, decode.acc_seg: 94.1201, aux.loss_ce: 0.0847, aux.acc_seg: 91.2998, loss: 0.2280, grad_norm: 2.0820
2023-02-19 09:58:07,580 - mmseg - INFO - Iter [74400/160000]	lr: 3.210e-05, eta: 6:47:37, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1403, decode.acc_seg: 94.0474, aux.loss_ce: 0.0806, aux.acc_seg: 91.6221, loss: 0.2208, grad_norm: 2.3466
2023-02-19 09:58:21,300 - mmseg - INFO - Iter [74450/160000]	lr: 3.208e-05, eta: 6:47:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1508, decode.acc_seg: 93.6873, aux.loss_ce: 0.0836, aux.acc_seg: 91.2774, loss: 0.2344, grad_norm: 2.4812
2023-02-19 09:58:35,467 - mmseg - INFO - Iter [74500/160000]	lr: 3.206e-05, eta: 6:47:07, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1551, decode.acc_seg: 93.5355, aux.loss_ce: 0.0864, aux.acc_seg: 90.9517, loss: 0.2414, grad_norm: 2.0266
2023-02-19 09:58:51,897 - mmseg - INFO - Iter [74550/160000]	lr: 3.204e-05, eta: 6:46:56, time: 0.329, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1418, decode.acc_seg: 94.1013, aux.loss_ce: 0.0809, aux.acc_seg: 91.7121, loss: 0.2227, grad_norm: 2.1314
2023-02-19 09:59:05,609 - mmseg - INFO - Iter [74600/160000]	lr: 3.203e-05, eta: 6:46:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1421, decode.acc_seg: 94.0912, aux.loss_ce: 0.0842, aux.acc_seg: 91.3881, loss: 0.2263, grad_norm: 2.3021
2023-02-19 09:59:19,316 - mmseg - INFO - Iter [74650/160000]	lr: 3.201e-05, eta: 6:46:26, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1504, decode.acc_seg: 93.7527, aux.loss_ce: 0.0862, aux.acc_seg: 91.2808, loss: 0.2366, grad_norm: 2.0189
2023-02-19 09:59:34,175 - mmseg - INFO - Iter [74700/160000]	lr: 3.199e-05, eta: 6:46:12, time: 0.297, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1426, decode.acc_seg: 94.0161, aux.loss_ce: 0.0818, aux.acc_seg: 91.4361, loss: 0.2243, grad_norm: 2.0517
2023-02-19 09:59:48,816 - mmseg - INFO - Iter [74750/160000]	lr: 3.197e-05, eta: 6:45:58, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1452, decode.acc_seg: 94.0524, aux.loss_ce: 0.0822, aux.acc_seg: 91.6820, loss: 0.2274, grad_norm: 2.1609
2023-02-19 10:00:03,451 - mmseg - INFO - Iter [74800/160000]	lr: 3.195e-05, eta: 6:45:44, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1427, decode.acc_seg: 94.1283, aux.loss_ce: 0.0838, aux.acc_seg: 91.4064, loss: 0.2265, grad_norm: 1.9036
2023-02-19 10:00:17,396 - mmseg - INFO - Iter [74850/160000]	lr: 3.193e-05, eta: 6:45:30, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1408, decode.acc_seg: 94.2664, aux.loss_ce: 0.0805, aux.acc_seg: 91.7653, loss: 0.2213, grad_norm: 2.2332
2023-02-19 10:00:31,106 - mmseg - INFO - Iter [74900/160000]	lr: 3.191e-05, eta: 6:45:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1507, decode.acc_seg: 93.6195, aux.loss_ce: 0.0866, aux.acc_seg: 91.0000, loss: 0.2373, grad_norm: 3.3552
2023-02-19 10:00:44,780 - mmseg - INFO - Iter [74950/160000]	lr: 3.189e-05, eta: 6:45:00, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1404, decode.acc_seg: 94.1352, aux.loss_ce: 0.0829, aux.acc_seg: 91.5245, loss: 0.2233, grad_norm: 2.2230
2023-02-19 10:00:58,401 - mmseg - INFO - Saving checkpoint at 75000 iterations
2023-02-19 10:01:01,700 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:01:01,700 - mmseg - INFO - Iter [75000/160000]	lr: 3.188e-05, eta: 6:44:48, time: 0.339, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1542, decode.acc_seg: 93.5466, aux.loss_ce: 0.0852, aux.acc_seg: 91.3059, loss: 0.2394, grad_norm: 2.1834
2023-02-19 10:01:15,693 - mmseg - INFO - Iter [75050/160000]	lr: 3.186e-05, eta: 6:44:34, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1338, decode.acc_seg: 94.2991, aux.loss_ce: 0.0797, aux.acc_seg: 91.6230, loss: 0.2135, grad_norm: 1.9119
2023-02-19 10:01:29,490 - mmseg - INFO - Iter [75100/160000]	lr: 3.184e-05, eta: 6:44:19, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1360, decode.acc_seg: 94.2567, aux.loss_ce: 0.0805, aux.acc_seg: 91.5082, loss: 0.2165, grad_norm: 2.7072
2023-02-19 10:01:44,076 - mmseg - INFO - Iter [75150/160000]	lr: 3.182e-05, eta: 6:44:05, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1451, decode.acc_seg: 93.9078, aux.loss_ce: 0.0812, aux.acc_seg: 91.6576, loss: 0.2264, grad_norm: 1.8327
2023-02-19 10:01:57,809 - mmseg - INFO - Iter [75200/160000]	lr: 3.180e-05, eta: 6:43:50, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1405, decode.acc_seg: 94.3017, aux.loss_ce: 0.0797, aux.acc_seg: 91.8790, loss: 0.2203, grad_norm: 1.6131
2023-02-19 10:02:12,516 - mmseg - INFO - Iter [75250/160000]	lr: 3.178e-05, eta: 6:43:36, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1430, decode.acc_seg: 93.9556, aux.loss_ce: 0.0818, aux.acc_seg: 91.4240, loss: 0.2248, grad_norm: 1.8909
2023-02-19 10:02:26,237 - mmseg - INFO - Iter [75300/160000]	lr: 3.176e-05, eta: 6:43:21, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1425, decode.acc_seg: 94.0165, aux.loss_ce: 0.0796, aux.acc_seg: 91.8531, loss: 0.2222, grad_norm: 2.2532
2023-02-19 10:02:39,993 - mmseg - INFO - Iter [75350/160000]	lr: 3.174e-05, eta: 6:43:06, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1369, decode.acc_seg: 94.1442, aux.loss_ce: 0.0827, aux.acc_seg: 91.5509, loss: 0.2196, grad_norm: 2.2009
2023-02-19 10:02:53,566 - mmseg - INFO - Iter [75400/160000]	lr: 3.173e-05, eta: 6:42:51, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1339, decode.acc_seg: 94.3274, aux.loss_ce: 0.0771, aux.acc_seg: 91.9119, loss: 0.2110, grad_norm: 1.7956
2023-02-19 10:03:07,319 - mmseg - INFO - Iter [75450/160000]	lr: 3.171e-05, eta: 6:42:37, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1496, decode.acc_seg: 93.8416, aux.loss_ce: 0.0838, aux.acc_seg: 91.3832, loss: 0.2334, grad_norm: 2.3166
2023-02-19 10:03:20,947 - mmseg - INFO - Iter [75500/160000]	lr: 3.169e-05, eta: 6:42:22, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1450, decode.acc_seg: 93.8996, aux.loss_ce: 0.0838, aux.acc_seg: 91.3239, loss: 0.2288, grad_norm: 2.6672
2023-02-19 10:03:35,333 - mmseg - INFO - Iter [75550/160000]	lr: 3.167e-05, eta: 6:42:07, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1418, decode.acc_seg: 94.1110, aux.loss_ce: 0.0828, aux.acc_seg: 91.5976, loss: 0.2246, grad_norm: 2.6507
2023-02-19 10:03:49,657 - mmseg - INFO - Iter [75600/160000]	lr: 3.165e-05, eta: 6:41:53, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1477, decode.acc_seg: 93.7577, aux.loss_ce: 0.0864, aux.acc_seg: 90.9361, loss: 0.2341, grad_norm: 2.1887
2023-02-19 10:04:03,706 - mmseg - INFO - Iter [75650/160000]	lr: 3.163e-05, eta: 6:41:39, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1493, decode.acc_seg: 93.8196, aux.loss_ce: 0.0841, aux.acc_seg: 91.2431, loss: 0.2334, grad_norm: 2.0354
2023-02-19 10:04:17,242 - mmseg - INFO - Iter [75700/160000]	lr: 3.161e-05, eta: 6:41:23, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1457, decode.acc_seg: 93.8541, aux.loss_ce: 0.0827, aux.acc_seg: 91.4303, loss: 0.2284, grad_norm: 2.5939
2023-02-19 10:04:30,846 - mmseg - INFO - Iter [75750/160000]	lr: 3.159e-05, eta: 6:41:08, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1400, decode.acc_seg: 94.0761, aux.loss_ce: 0.0797, aux.acc_seg: 91.6395, loss: 0.2197, grad_norm: 1.9812
2023-02-19 10:04:46,951 - mmseg - INFO - Iter [75800/160000]	lr: 3.158e-05, eta: 6:40:56, time: 0.322, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1468, decode.acc_seg: 93.8742, aux.loss_ce: 0.0823, aux.acc_seg: 91.4386, loss: 0.2291, grad_norm: 1.9636
2023-02-19 10:05:00,777 - mmseg - INFO - Iter [75850/160000]	lr: 3.156e-05, eta: 6:40:41, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1409, decode.acc_seg: 94.1829, aux.loss_ce: 0.0834, aux.acc_seg: 91.4381, loss: 0.2243, grad_norm: 2.1164
2023-02-19 10:05:14,484 - mmseg - INFO - Iter [75900/160000]	lr: 3.154e-05, eta: 6:40:26, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1310, decode.acc_seg: 94.4348, aux.loss_ce: 0.0751, aux.acc_seg: 92.1543, loss: 0.2060, grad_norm: 1.7216
2023-02-19 10:05:28,478 - mmseg - INFO - Iter [75950/160000]	lr: 3.152e-05, eta: 6:40:12, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1391, decode.acc_seg: 94.2467, aux.loss_ce: 0.0796, aux.acc_seg: 91.5529, loss: 0.2186, grad_norm: 3.4592
2023-02-19 10:05:42,668 - mmseg - INFO - Saving checkpoint at 76000 iterations
2023-02-19 10:05:45,996 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:05:45,996 - mmseg - INFO - Iter [76000/160000]	lr: 3.150e-05, eta: 6:40:01, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1461, decode.acc_seg: 93.8445, aux.loss_ce: 0.0821, aux.acc_seg: 91.6627, loss: 0.2282, grad_norm: 2.0278
2023-02-19 10:05:59,737 - mmseg - INFO - Iter [76050/160000]	lr: 3.148e-05, eta: 6:39:46, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1399, decode.acc_seg: 94.1157, aux.loss_ce: 0.0811, aux.acc_seg: 91.6000, loss: 0.2210, grad_norm: 1.7630
2023-02-19 10:06:13,493 - mmseg - INFO - Iter [76100/160000]	lr: 3.146e-05, eta: 6:39:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1458, decode.acc_seg: 93.8568, aux.loss_ce: 0.0821, aux.acc_seg: 91.4682, loss: 0.2279, grad_norm: 2.1284
2023-02-19 10:06:27,553 - mmseg - INFO - Iter [76150/160000]	lr: 3.144e-05, eta: 6:39:17, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1408, decode.acc_seg: 94.0661, aux.loss_ce: 0.0802, aux.acc_seg: 91.5971, loss: 0.2210, grad_norm: 2.0846
2023-02-19 10:06:41,527 - mmseg - INFO - Iter [76200/160000]	lr: 3.143e-05, eta: 6:39:02, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1488, decode.acc_seg: 93.7945, aux.loss_ce: 0.0856, aux.acc_seg: 91.1276, loss: 0.2345, grad_norm: 2.0682
2023-02-19 10:06:55,826 - mmseg - INFO - Iter [76250/160000]	lr: 3.141e-05, eta: 6:38:48, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1470, decode.acc_seg: 93.9422, aux.loss_ce: 0.0837, aux.acc_seg: 91.4198, loss: 0.2307, grad_norm: 2.1256
2023-02-19 10:07:09,453 - mmseg - INFO - Iter [76300/160000]	lr: 3.139e-05, eta: 6:38:33, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1419, decode.acc_seg: 94.1475, aux.loss_ce: 0.0821, aux.acc_seg: 91.5677, loss: 0.2241, grad_norm: 1.6812
2023-02-19 10:07:23,773 - mmseg - INFO - Iter [76350/160000]	lr: 3.137e-05, eta: 6:38:19, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1380, decode.acc_seg: 94.1173, aux.loss_ce: 0.0777, aux.acc_seg: 91.7986, loss: 0.2157, grad_norm: 2.1150
2023-02-19 10:07:38,166 - mmseg - INFO - Iter [76400/160000]	lr: 3.135e-05, eta: 6:38:04, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1414, decode.acc_seg: 94.0326, aux.loss_ce: 0.0836, aux.acc_seg: 91.3969, loss: 0.2251, grad_norm: 2.0132
2023-02-19 10:07:52,147 - mmseg - INFO - Iter [76450/160000]	lr: 3.133e-05, eta: 6:37:50, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1421, decode.acc_seg: 94.0983, aux.loss_ce: 0.0808, aux.acc_seg: 91.7829, loss: 0.2229, grad_norm: 2.0774
2023-02-19 10:08:05,766 - mmseg - INFO - Iter [76500/160000]	lr: 3.131e-05, eta: 6:37:35, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1368, decode.acc_seg: 94.2517, aux.loss_ce: 0.0787, aux.acc_seg: 91.6416, loss: 0.2155, grad_norm: 1.6915
2023-02-19 10:08:19,794 - mmseg - INFO - Iter [76550/160000]	lr: 3.129e-05, eta: 6:37:20, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1515, decode.acc_seg: 93.8724, aux.loss_ce: 0.0866, aux.acc_seg: 91.2218, loss: 0.2382, grad_norm: 2.1294
2023-02-19 10:08:33,809 - mmseg - INFO - Iter [76600/160000]	lr: 3.128e-05, eta: 6:37:06, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1504, decode.acc_seg: 93.9553, aux.loss_ce: 0.0848, aux.acc_seg: 91.4644, loss: 0.2352, grad_norm: 2.5135
2023-02-19 10:08:47,709 - mmseg - INFO - Iter [76650/160000]	lr: 3.126e-05, eta: 6:36:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1440, decode.acc_seg: 94.0976, aux.loss_ce: 0.0835, aux.acc_seg: 91.5451, loss: 0.2275, grad_norm: 2.4209
2023-02-19 10:09:01,293 - mmseg - INFO - Iter [76700/160000]	lr: 3.124e-05, eta: 6:36:36, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1458, decode.acc_seg: 93.8355, aux.loss_ce: 0.0839, aux.acc_seg: 91.2650, loss: 0.2297, grad_norm: 2.2453
2023-02-19 10:09:15,257 - mmseg - INFO - Iter [76750/160000]	lr: 3.122e-05, eta: 6:36:21, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1356, decode.acc_seg: 94.2727, aux.loss_ce: 0.0781, aux.acc_seg: 91.8901, loss: 0.2138, grad_norm: 2.4655
2023-02-19 10:09:29,449 - mmseg - INFO - Iter [76800/160000]	lr: 3.120e-05, eta: 6:36:07, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.0350, aux.loss_ce: 0.0789, aux.acc_seg: 91.6911, loss: 0.2180, grad_norm: 1.9814
2023-02-19 10:09:43,731 - mmseg - INFO - Iter [76850/160000]	lr: 3.118e-05, eta: 6:35:53, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1413, decode.acc_seg: 93.9588, aux.loss_ce: 0.0826, aux.acc_seg: 91.2011, loss: 0.2239, grad_norm: 1.9948
2023-02-19 10:09:58,016 - mmseg - INFO - Iter [76900/160000]	lr: 3.116e-05, eta: 6:35:38, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1397, decode.acc_seg: 94.2787, aux.loss_ce: 0.0781, aux.acc_seg: 92.0197, loss: 0.2177, grad_norm: 1.8096
2023-02-19 10:10:11,845 - mmseg - INFO - Iter [76950/160000]	lr: 3.114e-05, eta: 6:35:24, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1354, decode.acc_seg: 94.4373, aux.loss_ce: 0.0779, aux.acc_seg: 91.9330, loss: 0.2132, grad_norm: 1.9032
2023-02-19 10:10:25,401 - mmseg - INFO - Saving checkpoint at 77000 iterations
2023-02-19 10:10:28,722 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:10:28,722 - mmseg - INFO - Iter [77000/160000]	lr: 3.113e-05, eta: 6:35:12, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1329, decode.acc_seg: 94.2926, aux.loss_ce: 0.0752, aux.acc_seg: 92.0102, loss: 0.2081, grad_norm: 1.7147
2023-02-19 10:10:44,680 - mmseg - INFO - Iter [77050/160000]	lr: 3.111e-05, eta: 6:35:00, time: 0.319, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1391, decode.acc_seg: 94.1057, aux.loss_ce: 0.0801, aux.acc_seg: 91.5599, loss: 0.2192, grad_norm: 1.9183
2023-02-19 10:10:58,626 - mmseg - INFO - Iter [77100/160000]	lr: 3.109e-05, eta: 6:34:45, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1442, decode.acc_seg: 93.9539, aux.loss_ce: 0.0844, aux.acc_seg: 91.2775, loss: 0.2286, grad_norm: 2.0793
2023-02-19 10:11:12,534 - mmseg - INFO - Iter [77150/160000]	lr: 3.107e-05, eta: 6:34:30, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1456, decode.acc_seg: 93.8354, aux.loss_ce: 0.0831, aux.acc_seg: 91.3033, loss: 0.2287, grad_norm: 2.2996
2023-02-19 10:11:26,957 - mmseg - INFO - Iter [77200/160000]	lr: 3.105e-05, eta: 6:34:16, time: 0.289, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.2695, aux.loss_ce: 0.0806, aux.acc_seg: 91.6808, loss: 0.2162, grad_norm: 1.9334
2023-02-19 10:11:41,346 - mmseg - INFO - Iter [77250/160000]	lr: 3.103e-05, eta: 6:34:02, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1367, decode.acc_seg: 94.2785, aux.loss_ce: 0.0799, aux.acc_seg: 91.7733, loss: 0.2165, grad_norm: 1.7039
2023-02-19 10:11:55,399 - mmseg - INFO - Iter [77300/160000]	lr: 3.101e-05, eta: 6:33:47, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1307, decode.acc_seg: 94.5419, aux.loss_ce: 0.0766, aux.acc_seg: 92.1066, loss: 0.2072, grad_norm: 1.8451
2023-02-19 10:12:09,099 - mmseg - INFO - Iter [77350/160000]	lr: 3.099e-05, eta: 6:33:33, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1329, decode.acc_seg: 94.4243, aux.loss_ce: 0.0782, aux.acc_seg: 91.8454, loss: 0.2111, grad_norm: 1.7967
2023-02-19 10:12:22,956 - mmseg - INFO - Iter [77400/160000]	lr: 3.098e-05, eta: 6:33:18, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1515, decode.acc_seg: 93.7896, aux.loss_ce: 0.0840, aux.acc_seg: 91.4589, loss: 0.2355, grad_norm: 2.4841
2023-02-19 10:12:37,257 - mmseg - INFO - Iter [77450/160000]	lr: 3.096e-05, eta: 6:33:04, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1552, decode.acc_seg: 93.6019, aux.loss_ce: 0.0884, aux.acc_seg: 90.8397, loss: 0.2435, grad_norm: 2.6517
2023-02-19 10:12:50,840 - mmseg - INFO - Iter [77500/160000]	lr: 3.094e-05, eta: 6:32:48, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1453, decode.acc_seg: 93.9660, aux.loss_ce: 0.0833, aux.acc_seg: 91.5327, loss: 0.2285, grad_norm: 2.1752
2023-02-19 10:13:04,536 - mmseg - INFO - Iter [77550/160000]	lr: 3.092e-05, eta: 6:32:34, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1342, decode.acc_seg: 94.4358, aux.loss_ce: 0.0768, aux.acc_seg: 92.1812, loss: 0.2110, grad_norm: 1.7944
2023-02-19 10:13:18,702 - mmseg - INFO - Iter [77600/160000]	lr: 3.090e-05, eta: 6:32:19, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1460, decode.acc_seg: 93.8731, aux.loss_ce: 0.0840, aux.acc_seg: 91.3267, loss: 0.2300, grad_norm: 2.2501
2023-02-19 10:13:32,233 - mmseg - INFO - Iter [77650/160000]	lr: 3.088e-05, eta: 6:32:04, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1416, decode.acc_seg: 94.2469, aux.loss_ce: 0.0802, aux.acc_seg: 91.8131, loss: 0.2217, grad_norm: 1.7655
2023-02-19 10:13:46,267 - mmseg - INFO - Iter [77700/160000]	lr: 3.086e-05, eta: 6:31:50, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1309, decode.acc_seg: 94.4692, aux.loss_ce: 0.0753, aux.acc_seg: 92.0518, loss: 0.2061, grad_norm: 1.7764
2023-02-19 10:14:00,194 - mmseg - INFO - Iter [77750/160000]	lr: 3.084e-05, eta: 6:31:35, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.2758, aux.loss_ce: 0.0826, aux.acc_seg: 91.6197, loss: 0.2218, grad_norm: 2.1702
2023-02-19 10:14:14,128 - mmseg - INFO - Iter [77800/160000]	lr: 3.083e-05, eta: 6:31:20, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1364, decode.acc_seg: 94.3024, aux.loss_ce: 0.0789, aux.acc_seg: 91.7600, loss: 0.2153, grad_norm: 1.6970
2023-02-19 10:14:28,576 - mmseg - INFO - Iter [77850/160000]	lr: 3.081e-05, eta: 6:31:06, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1402, decode.acc_seg: 94.1102, aux.loss_ce: 0.0819, aux.acc_seg: 91.4616, loss: 0.2221, grad_norm: 2.2888
2023-02-19 10:14:42,751 - mmseg - INFO - Iter [77900/160000]	lr: 3.079e-05, eta: 6:30:52, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1336, decode.acc_seg: 94.4054, aux.loss_ce: 0.0787, aux.acc_seg: 91.7276, loss: 0.2123, grad_norm: 1.6359
2023-02-19 10:14:57,032 - mmseg - INFO - Iter [77950/160000]	lr: 3.077e-05, eta: 6:30:37, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1330, decode.acc_seg: 94.3086, aux.loss_ce: 0.0793, aux.acc_seg: 91.6566, loss: 0.2123, grad_norm: 2.3374
2023-02-19 10:15:11,521 - mmseg - INFO - Saving checkpoint at 78000 iterations
2023-02-19 10:15:14,783 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:15:14,783 - mmseg - INFO - Iter [78000/160000]	lr: 3.075e-05, eta: 6:30:27, time: 0.355, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1417, decode.acc_seg: 93.9882, aux.loss_ce: 0.0791, aux.acc_seg: 91.7974, loss: 0.2208, grad_norm: 1.7976
2023-02-19 10:15:28,730 - mmseg - INFO - Iter [78050/160000]	lr: 3.073e-05, eta: 6:30:12, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1521, decode.acc_seg: 93.6552, aux.loss_ce: 0.0862, aux.acc_seg: 91.0468, loss: 0.2383, grad_norm: 3.4918
2023-02-19 10:15:42,383 - mmseg - INFO - Iter [78100/160000]	lr: 3.071e-05, eta: 6:29:57, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1415, decode.acc_seg: 94.0053, aux.loss_ce: 0.0848, aux.acc_seg: 91.2003, loss: 0.2262, grad_norm: 2.5287
2023-02-19 10:15:56,320 - mmseg - INFO - Iter [78150/160000]	lr: 3.069e-05, eta: 6:29:43, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1459, decode.acc_seg: 93.9505, aux.loss_ce: 0.0817, aux.acc_seg: 91.5762, loss: 0.2275, grad_norm: 1.7042
2023-02-19 10:16:10,794 - mmseg - INFO - Iter [78200/160000]	lr: 3.068e-05, eta: 6:29:28, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1372, decode.acc_seg: 94.2487, aux.loss_ce: 0.0790, aux.acc_seg: 91.7271, loss: 0.2163, grad_norm: 1.5974
2023-02-19 10:16:24,477 - mmseg - INFO - Iter [78250/160000]	lr: 3.066e-05, eta: 6:29:14, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1519, decode.acc_seg: 93.6587, aux.loss_ce: 0.0842, aux.acc_seg: 91.3719, loss: 0.2361, grad_norm: 1.9654
2023-02-19 10:16:37,991 - mmseg - INFO - Iter [78300/160000]	lr: 3.064e-05, eta: 6:28:58, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.1461, aux.loss_ce: 0.0778, aux.acc_seg: 91.9113, loss: 0.2170, grad_norm: 1.9684
2023-02-19 10:16:54,188 - mmseg - INFO - Iter [78350/160000]	lr: 3.062e-05, eta: 6:28:46, time: 0.324, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1359, decode.acc_seg: 93.9629, aux.loss_ce: 0.0797, aux.acc_seg: 91.4693, loss: 0.2156, grad_norm: 1.9856
2023-02-19 10:17:07,894 - mmseg - INFO - Iter [78400/160000]	lr: 3.060e-05, eta: 6:28:31, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1341, decode.acc_seg: 94.3591, aux.loss_ce: 0.0794, aux.acc_seg: 91.8037, loss: 0.2135, grad_norm: 1.9382
2023-02-19 10:17:21,758 - mmseg - INFO - Iter [78450/160000]	lr: 3.058e-05, eta: 6:28:17, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1416, decode.acc_seg: 94.2045, aux.loss_ce: 0.0799, aux.acc_seg: 91.8967, loss: 0.2215, grad_norm: 1.8033
2023-02-19 10:17:35,348 - mmseg - INFO - Iter [78500/160000]	lr: 3.056e-05, eta: 6:28:02, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1406, decode.acc_seg: 94.1115, aux.loss_ce: 0.0807, aux.acc_seg: 91.5499, loss: 0.2213, grad_norm: 1.7851
2023-02-19 10:17:49,000 - mmseg - INFO - Iter [78550/160000]	lr: 3.054e-05, eta: 6:27:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1387, decode.acc_seg: 94.2160, aux.loss_ce: 0.0828, aux.acc_seg: 91.4722, loss: 0.2215, grad_norm: 1.7953
2023-02-19 10:18:02,986 - mmseg - INFO - Iter [78600/160000]	lr: 3.053e-05, eta: 6:27:32, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1416, decode.acc_seg: 94.1223, aux.loss_ce: 0.0798, aux.acc_seg: 91.6559, loss: 0.2215, grad_norm: 1.9838
2023-02-19 10:18:16,627 - mmseg - INFO - Iter [78650/160000]	lr: 3.051e-05, eta: 6:27:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1406, decode.acc_seg: 94.1829, aux.loss_ce: 0.0804, aux.acc_seg: 91.7621, loss: 0.2209, grad_norm: 1.7621
2023-02-19 10:18:30,351 - mmseg - INFO - Iter [78700/160000]	lr: 3.049e-05, eta: 6:27:02, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1388, decode.acc_seg: 94.0608, aux.loss_ce: 0.0782, aux.acc_seg: 91.6923, loss: 0.2171, grad_norm: 1.8977
2023-02-19 10:18:44,661 - mmseg - INFO - Iter [78750/160000]	lr: 3.047e-05, eta: 6:26:48, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1405, decode.acc_seg: 93.9667, aux.loss_ce: 0.0847, aux.acc_seg: 91.3709, loss: 0.2252, grad_norm: 2.7283
2023-02-19 10:18:59,049 - mmseg - INFO - Iter [78800/160000]	lr: 3.045e-05, eta: 6:26:34, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1401, decode.acc_seg: 93.9542, aux.loss_ce: 0.0825, aux.acc_seg: 91.3126, loss: 0.2226, grad_norm: 2.0815
2023-02-19 10:19:12,706 - mmseg - INFO - Iter [78850/160000]	lr: 3.043e-05, eta: 6:26:19, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1291, decode.acc_seg: 94.4389, aux.loss_ce: 0.0762, aux.acc_seg: 91.7514, loss: 0.2053, grad_norm: 1.5959
2023-02-19 10:19:26,767 - mmseg - INFO - Iter [78900/160000]	lr: 3.041e-05, eta: 6:26:04, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.4530, aux.loss_ce: 0.0771, aux.acc_seg: 91.7602, loss: 0.2055, grad_norm: 1.8958
2023-02-19 10:19:41,193 - mmseg - INFO - Iter [78950/160000]	lr: 3.039e-05, eta: 6:25:50, time: 0.289, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1344, decode.acc_seg: 94.3198, aux.loss_ce: 0.0776, aux.acc_seg: 91.8858, loss: 0.2120, grad_norm: 1.7183
2023-02-19 10:19:55,905 - mmseg - INFO - Saving checkpoint at 79000 iterations
2023-02-19 10:19:59,136 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:19:59,136 - mmseg - INFO - Iter [79000/160000]	lr: 3.038e-05, eta: 6:25:40, time: 0.359, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1313, decode.acc_seg: 94.3053, aux.loss_ce: 0.0767, aux.acc_seg: 91.7678, loss: 0.2079, grad_norm: 1.7774
2023-02-19 10:20:13,320 - mmseg - INFO - Iter [79050/160000]	lr: 3.036e-05, eta: 6:25:25, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1324, decode.acc_seg: 94.3820, aux.loss_ce: 0.0793, aux.acc_seg: 91.6532, loss: 0.2117, grad_norm: 1.6710
2023-02-19 10:20:27,942 - mmseg - INFO - Iter [79100/160000]	lr: 3.034e-05, eta: 6:25:11, time: 0.293, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1380, decode.acc_seg: 94.1630, aux.loss_ce: 0.0846, aux.acc_seg: 91.2252, loss: 0.2226, grad_norm: 2.4103
2023-02-19 10:20:42,464 - mmseg - INFO - Iter [79150/160000]	lr: 3.032e-05, eta: 6:24:57, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1450, decode.acc_seg: 93.9438, aux.loss_ce: 0.0836, aux.acc_seg: 91.3480, loss: 0.2287, grad_norm: 2.1811
2023-02-19 10:20:56,058 - mmseg - INFO - Iter [79200/160000]	lr: 3.030e-05, eta: 6:24:42, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1402, decode.acc_seg: 94.1682, aux.loss_ce: 0.0807, aux.acc_seg: 91.6350, loss: 0.2210, grad_norm: 1.9680
2023-02-19 10:21:10,286 - mmseg - INFO - Iter [79250/160000]	lr: 3.028e-05, eta: 6:24:28, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1352, decode.acc_seg: 94.2270, aux.loss_ce: 0.0765, aux.acc_seg: 91.8528, loss: 0.2118, grad_norm: 2.1970
2023-02-19 10:21:24,104 - mmseg - INFO - Iter [79300/160000]	lr: 3.026e-05, eta: 6:24:13, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1418, decode.acc_seg: 94.0650, aux.loss_ce: 0.0823, aux.acc_seg: 91.4472, loss: 0.2241, grad_norm: 1.9553
2023-02-19 10:21:38,175 - mmseg - INFO - Iter [79350/160000]	lr: 3.024e-05, eta: 6:23:59, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1345, decode.acc_seg: 94.3434, aux.loss_ce: 0.0766, aux.acc_seg: 91.9922, loss: 0.2111, grad_norm: 1.8020
2023-02-19 10:21:52,227 - mmseg - INFO - Iter [79400/160000]	lr: 3.023e-05, eta: 6:23:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1384, decode.acc_seg: 94.1089, aux.loss_ce: 0.0806, aux.acc_seg: 91.5160, loss: 0.2190, grad_norm: 1.7330
2023-02-19 10:22:06,127 - mmseg - INFO - Iter [79450/160000]	lr: 3.021e-05, eta: 6:23:30, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1514, decode.acc_seg: 93.6742, aux.loss_ce: 0.0874, aux.acc_seg: 91.0571, loss: 0.2388, grad_norm: 1.9220
2023-02-19 10:22:20,404 - mmseg - INFO - Iter [79500/160000]	lr: 3.019e-05, eta: 6:23:15, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.2137, aux.loss_ce: 0.0815, aux.acc_seg: 91.4839, loss: 0.2207, grad_norm: 1.9298
2023-02-19 10:22:34,412 - mmseg - INFO - Iter [79550/160000]	lr: 3.017e-05, eta: 6:23:01, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1425, decode.acc_seg: 93.9160, aux.loss_ce: 0.0830, aux.acc_seg: 91.3860, loss: 0.2255, grad_norm: 2.0393
2023-02-19 10:22:51,030 - mmseg - INFO - Iter [79600/160000]	lr: 3.015e-05, eta: 6:22:49, time: 0.332, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1312, decode.acc_seg: 94.6085, aux.loss_ce: 0.0794, aux.acc_seg: 91.9759, loss: 0.2105, grad_norm: 1.6442
2023-02-19 10:23:04,587 - mmseg - INFO - Iter [79650/160000]	lr: 3.013e-05, eta: 6:22:34, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.5663, aux.loss_ce: 0.0779, aux.acc_seg: 91.8842, loss: 0.2063, grad_norm: 1.7657
2023-02-19 10:23:18,161 - mmseg - INFO - Iter [79700/160000]	lr: 3.011e-05, eta: 6:22:19, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1362, decode.acc_seg: 94.2578, aux.loss_ce: 0.0785, aux.acc_seg: 91.8709, loss: 0.2147, grad_norm: 2.0081
2023-02-19 10:23:31,911 - mmseg - INFO - Iter [79750/160000]	lr: 3.009e-05, eta: 6:22:04, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1415, decode.acc_seg: 94.2406, aux.loss_ce: 0.0790, aux.acc_seg: 91.9997, loss: 0.2205, grad_norm: 1.8508
2023-02-19 10:23:45,699 - mmseg - INFO - Iter [79800/160000]	lr: 3.008e-05, eta: 6:21:49, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1334, decode.acc_seg: 94.4451, aux.loss_ce: 0.0772, aux.acc_seg: 92.1101, loss: 0.2106, grad_norm: 2.0590
2023-02-19 10:24:00,243 - mmseg - INFO - Iter [79850/160000]	lr: 3.006e-05, eta: 6:21:35, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1337, decode.acc_seg: 94.2555, aux.loss_ce: 0.0768, aux.acc_seg: 91.8330, loss: 0.2104, grad_norm: 1.9377
2023-02-19 10:24:13,913 - mmseg - INFO - Iter [79900/160000]	lr: 3.004e-05, eta: 6:21:20, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1398, decode.acc_seg: 93.9676, aux.loss_ce: 0.0803, aux.acc_seg: 91.6123, loss: 0.2201, grad_norm: 1.8970
2023-02-19 10:24:27,745 - mmseg - INFO - Iter [79950/160000]	lr: 3.002e-05, eta: 6:21:06, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1288, decode.acc_seg: 94.5460, aux.loss_ce: 0.0775, aux.acc_seg: 92.0009, loss: 0.2063, grad_norm: 2.6279
2023-02-19 10:24:41,535 - mmseg - INFO - Saving checkpoint at 80000 iterations
2023-02-19 10:24:44,804 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:24:44,804 - mmseg - INFO - Iter [80000/160000]	lr: 3.000e-05, eta: 6:20:54, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1367, decode.acc_seg: 94.2525, aux.loss_ce: 0.0802, aux.acc_seg: 91.7869, loss: 0.2170, grad_norm: 2.9634
2023-02-19 10:24:59,557 - mmseg - INFO - per class results:
2023-02-19 10:24:59,563 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        |  78.5 | 86.58 |
|       building      | 82.74 | 90.25 |
|         sky         | 93.95 | 98.15 |
|        floor        | 80.88 | 91.58 |
|         tree        |  75.5 | 86.62 |
|       ceiling       | 83.61 | 95.16 |
|         road        | 84.88 | 92.06 |
|         bed         | 91.12 | 96.06 |
|      windowpane     | 63.73 | 79.76 |
|        grass        | 66.98 | 82.11 |
|       cabinet       | 62.88 | 78.65 |
|       sidewalk      | 70.13 | 83.72 |
|        person       | 81.61 | 92.03 |
|        earth        | 37.54 |  53.4 |
|         door        | 52.46 | 63.22 |
|        table        | 63.45 | 73.32 |
|       mountain      |  58.1 | 75.44 |
|        plant        | 51.25 | 73.42 |
|       curtain       | 76.81 | 86.15 |
|        chair        | 62.73 | 78.17 |
|         car         | 85.15 |  91.8 |
|        water        | 59.37 |  76.2 |
|       painting      | 75.31 | 90.39 |
|         sofa        | 67.92 |  84.1 |
|        shelf        | 43.26 | 56.75 |
|        house        | 40.44 | 50.63 |
|         sea         | 64.87 | 85.29 |
|        mirror       | 71.78 | 84.29 |
|         rug         | 54.78 |  63.6 |
|        field        | 28.88 | 48.85 |
|       armchair      | 50.74 | 72.43 |
|         seat        | 61.56 | 84.34 |
|        fence        | 43.99 | 61.91 |
|         desk        | 50.02 | 67.59 |
|         rock        | 49.29 | 65.09 |
|       wardrobe      | 45.54 | 65.06 |
|         lamp        | 64.58 | 79.64 |
|       bathtub       | 79.46 | 84.03 |
|       railing       | 38.23 | 60.39 |
|       cushion       | 59.36 | 74.44 |
|         base        | 45.14 | 60.28 |
|         box         | 27.62 | 34.46 |
|        column       | 52.93 | 65.13 |
|      signboard      | 39.29 | 55.61 |
|   chest of drawers  |  47.6 | 63.17 |
|       counter       | 27.92 | 35.29 |
|         sand        | 56.61 | 70.78 |
|         sink        | 74.46 | 84.26 |
|      skyscraper     | 57.94 | 71.83 |
|      fireplace      | 74.93 |  89.6 |
|     refrigerator    | 77.51 | 87.29 |
|      grandstand     | 45.74 | 84.26 |
|         path        | 24.74 | 38.54 |
|        stairs       | 31.97 | 38.91 |
|        runway       | 72.63 |  95.6 |
|         case        | 43.95 | 67.95 |
|      pool table     | 93.22 | 96.92 |
|        pillow       | 55.02 | 62.35 |
|     screen door     | 78.62 | 81.64 |
|       stairway      | 32.83 | 38.62 |
|        river        |  8.49 | 13.28 |
|        bridge       | 67.87 | 86.17 |
|       bookcase      | 43.36 | 65.48 |
|        blind        | 35.92 | 38.05 |
|     coffee table    | 61.68 | 78.51 |
|        toilet       | 86.05 | 91.59 |
|        flower       |  38.5 | 55.09 |
|         book        | 45.14 |  76.3 |
|         hill        | 15.17 | 24.91 |
|        bench        | 47.37 | 55.65 |
|      countertop     | 58.14 | 74.93 |
|        stove        | 78.78 | 88.11 |
|         palm        | 56.18 | 80.11 |
|    kitchen island   | 53.88 | 69.45 |
|       computer      | 78.16 | 89.11 |
|     swivel chair    | 40.37 | 52.97 |
|         boat        | 54.52 | 59.28 |
|         bar         | 48.68 | 64.27 |
|    arcade machine   | 74.13 | 77.95 |
|        hovel        | 36.92 | 48.34 |
|         bus         | 89.68 | 96.74 |
|        towel        | 73.13 | 85.08 |
|        light        | 53.71 | 58.83 |
|        truck        | 40.83 | 54.94 |
|        tower        | 35.62 | 69.33 |
|      chandelier     |  68.7 | 86.35 |
|        awning       | 38.09 | 46.53 |
|     streetlight     |  30.3 | 39.33 |
|        booth        | 45.75 | 77.89 |
| television receiver | 71.76 | 82.15 |
|       airplane      | 61.57 | 67.43 |
|      dirt track     | 15.56 | 40.84 |
|       apparel       | 45.72 | 75.86 |
|         pole        | 24.27 | 33.48 |
|         land        |  4.72 |  5.34 |
|      bannister      | 13.07 | 15.67 |
|      escalator      |  48.6 | 67.18 |
|       ottoman       | 46.77 |  63.5 |
|        bottle       |  36.9 | 62.43 |
|        buffet       | 42.62 | 50.27 |
|        poster       | 29.82 | 44.39 |
|        stage        | 14.77 | 22.59 |
|         van         | 39.58 | 53.65 |
|         ship        | 36.59 | 49.39 |
|       fountain      | 25.86 | 26.88 |
|    conveyer belt    | 81.15 | 93.99 |
|        canopy       | 45.17 | 50.95 |
|        washer       | 71.25 | 74.47 |
|      plaything      | 29.94 | 37.93 |
|    swimming pool    | 57.68 | 80.74 |
|        stool        | 41.43 | 51.92 |
|        barrel       | 44.76 |  74.0 |
|        basket       | 38.18 | 63.06 |
|      waterfall      | 47.96 | 55.71 |
|         tent        | 94.11 | 97.65 |
|         bag         |  17.5 | 22.28 |
|       minibike      |  73.6 | 87.55 |
|        cradle       | 84.37 | 96.73 |
|         oven        | 47.98 | 56.73 |
|         ball        | 51.41 |  58.2 |
|         food        | 63.51 | 82.87 |
|         step        | 17.51 |  22.5 |
|         tank        |  59.1 | 63.77 |
|      trade name     | 26.45 | 33.03 |
|      microwave      |  81.6 | 91.86 |
|         pot         | 47.57 | 53.88 |
|        animal       | 65.19 | 67.35 |
|       bicycle       |  57.9 | 80.22 |
|         lake        | 53.96 | 61.98 |
|      dishwasher     | 70.39 | 82.27 |
|        screen       | 56.31 | 66.81 |
|       blanket       | 20.72 | 26.01 |
|      sculpture      | 66.98 | 86.19 |
|         hood        | 68.91 | 72.49 |
|        sconce       | 45.84 | 60.81 |
|         vase        | 44.29 |  57.2 |
|    traffic light    | 36.66 | 53.41 |
|         tray        | 12.18 | 17.19 |
|        ashcan       | 37.73 | 49.93 |
|         fan         | 67.67 | 79.33 |
|         pier        | 32.41 | 42.86 |
|      crt screen     |  4.01 | 10.12 |
|        plate        | 60.39 | 75.03 |
|       monitor       | 20.05 | 23.31 |
|    bulletin board   | 42.16 | 56.15 |
|        shower       |  5.72 | 11.14 |
|       radiator      | 71.29 | 79.79 |
|        glass        | 13.72 | 14.53 |
|        clock        | 38.05 | 48.92 |
|         flag        | 65.18 | 80.73 |
+---------------------+-------+-------+
2023-02-19 10:24:59,563 - mmseg - INFO - Summary:
2023-02-19 10:24:59,564 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 83.84 | 52.38 | 64.96 |
+-------+-------+-------+
2023-02-19 10:25:02,734 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_80000.pth.
2023-02-19 10:25:02,734 - mmseg - INFO - Best mIoU is 0.5238 at 80000 iter.
2023-02-19 10:25:02,735 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:25:02,735 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8384, mIoU: 0.5238, mAcc: 0.6496, IoU.wall: 0.7850, IoU.building: 0.8274, IoU.sky: 0.9395, IoU.floor: 0.8088, IoU.tree: 0.7550, IoU.ceiling: 0.8361, IoU.road: 0.8488, IoU.bed : 0.9112, IoU.windowpane: 0.6373, IoU.grass: 0.6698, IoU.cabinet: 0.6288, IoU.sidewalk: 0.7013, IoU.person: 0.8161, IoU.earth: 0.3754, IoU.door: 0.5246, IoU.table: 0.6345, IoU.mountain: 0.5810, IoU.plant: 0.5125, IoU.curtain: 0.7681, IoU.chair: 0.6273, IoU.car: 0.8515, IoU.water: 0.5937, IoU.painting: 0.7531, IoU.sofa: 0.6792, IoU.shelf: 0.4326, IoU.house: 0.4044, IoU.sea: 0.6487, IoU.mirror: 0.7178, IoU.rug: 0.5478, IoU.field: 0.2888, IoU.armchair: 0.5074, IoU.seat: 0.6156, IoU.fence: 0.4399, IoU.desk: 0.5002, IoU.rock: 0.4929, IoU.wardrobe: 0.4554, IoU.lamp: 0.6458, IoU.bathtub: 0.7946, IoU.railing: 0.3823, IoU.cushion: 0.5936, IoU.base: 0.4514, IoU.box: 0.2762, IoU.column: 0.5293, IoU.signboard: 0.3929, IoU.chest of drawers: 0.4760, IoU.counter: 0.2792, IoU.sand: 0.5661, IoU.sink: 0.7446, IoU.skyscraper: 0.5794, IoU.fireplace: 0.7493, IoU.refrigerator: 0.7751, IoU.grandstand: 0.4574, IoU.path: 0.2474, IoU.stairs: 0.3197, IoU.runway: 0.7263, IoU.case: 0.4395, IoU.pool table: 0.9322, IoU.pillow: 0.5502, IoU.screen door: 0.7862, IoU.stairway: 0.3283, IoU.river: 0.0849, IoU.bridge: 0.6787, IoU.bookcase: 0.4336, IoU.blind: 0.3592, IoU.coffee table: 0.6168, IoU.toilet: 0.8605, IoU.flower: 0.3850, IoU.book: 0.4514, IoU.hill: 0.1517, IoU.bench: 0.4737, IoU.countertop: 0.5814, IoU.stove: 0.7878, IoU.palm: 0.5618, IoU.kitchen island: 0.5388, IoU.computer: 0.7816, IoU.swivel chair: 0.4037, IoU.boat: 0.5452, IoU.bar: 0.4868, IoU.arcade machine: 0.7413, IoU.hovel: 0.3692, IoU.bus: 0.8968, IoU.towel: 0.7313, IoU.light: 0.5371, IoU.truck: 0.4083, IoU.tower: 0.3562, IoU.chandelier: 0.6870, IoU.awning: 0.3809, IoU.streetlight: 0.3030, IoU.booth: 0.4575, IoU.television receiver: 0.7176, IoU.airplane: 0.6157, IoU.dirt track: 0.1556, IoU.apparel: 0.4572, IoU.pole: 0.2427, IoU.land: 0.0472, IoU.bannister: 0.1307, IoU.escalator: 0.4860, IoU.ottoman: 0.4677, IoU.bottle: 0.3690, IoU.buffet: 0.4262, IoU.poster: 0.2982, IoU.stage: 0.1477, IoU.van: 0.3958, IoU.ship: 0.3659, IoU.fountain: 0.2586, IoU.conveyer belt: 0.8115, IoU.canopy: 0.4517, IoU.washer: 0.7125, IoU.plaything: 0.2994, IoU.swimming pool: 0.5768, IoU.stool: 0.4143, IoU.barrel: 0.4476, IoU.basket: 0.3818, IoU.waterfall: 0.4796, IoU.tent: 0.9411, IoU.bag: 0.1750, IoU.minibike: 0.7360, IoU.cradle: 0.8437, IoU.oven: 0.4798, IoU.ball: 0.5141, IoU.food: 0.6351, IoU.step: 0.1751, IoU.tank: 0.5910, IoU.trade name: 0.2645, IoU.microwave: 0.8160, IoU.pot: 0.4757, IoU.animal: 0.6519, IoU.bicycle: 0.5790, IoU.lake: 0.5396, IoU.dishwasher: 0.7039, IoU.screen: 0.5631, IoU.blanket: 0.2072, IoU.sculpture: 0.6698, IoU.hood: 0.6891, IoU.sconce: 0.4584, IoU.vase: 0.4429, IoU.traffic light: 0.3666, IoU.tray: 0.1218, IoU.ashcan: 0.3773, IoU.fan: 0.6767, IoU.pier: 0.3241, IoU.crt screen: 0.0401, IoU.plate: 0.6039, IoU.monitor: 0.2005, IoU.bulletin board: 0.4216, IoU.shower: 0.0572, IoU.radiator: 0.7129, IoU.glass: 0.1372, IoU.clock: 0.3805, IoU.flag: 0.6518, Acc.wall: 0.8658, Acc.building: 0.9025, Acc.sky: 0.9815, Acc.floor: 0.9158, Acc.tree: 0.8662, Acc.ceiling: 0.9516, Acc.road: 0.9206, Acc.bed : 0.9606, Acc.windowpane: 0.7976, Acc.grass: 0.8211, Acc.cabinet: 0.7865, Acc.sidewalk: 0.8372, Acc.person: 0.9203, Acc.earth: 0.5340, Acc.door: 0.6322, Acc.table: 0.7332, Acc.mountain: 0.7544, Acc.plant: 0.7342, Acc.curtain: 0.8615, 
Acc.chair: 0.7817, Acc.car: 0.9180, Acc.water: 0.7620, Acc.painting: 0.9039, Acc.sofa: 0.8410, Acc.shelf: 0.5675, Acc.house: 0.5063, Acc.sea: 0.8529, Acc.mirror: 0.8429, Acc.rug: 0.6360, Acc.field: 0.4885, Acc.armchair: 0.7243, Acc.seat: 0.8434, Acc.fence: 0.6191, Acc.desk: 0.6759, Acc.rock: 0.6509, Acc.wardrobe: 0.6506, Acc.lamp: 0.7964, Acc.bathtub: 0.8403, Acc.railing: 0.6039, Acc.cushion: 0.7444, Acc.base: 0.6028, Acc.box: 0.3446, Acc.column: 0.6513, Acc.signboard: 0.5561, Acc.chest of drawers: 0.6317, Acc.counter: 0.3529, Acc.sand: 0.7078, Acc.sink: 0.8426, Acc.skyscraper: 0.7183, Acc.fireplace: 0.8960, Acc.refrigerator: 0.8729, Acc.grandstand: 0.8426, Acc.path: 0.3854, Acc.stairs: 0.3891, Acc.runway: 0.9560, Acc.case: 0.6795, Acc.pool table: 0.9692, Acc.pillow: 0.6235, Acc.screen door: 0.8164, Acc.stairway: 0.3862, Acc.river: 0.1328, Acc.bridge: 0.8617, Acc.bookcase: 0.6548, Acc.blind: 0.3805, Acc.coffee table: 0.7851, Acc.toilet: 0.9159, Acc.flower: 0.5509, Acc.book: 0.7630, Acc.hill: 0.2491, Acc.bench: 0.5565, Acc.countertop: 0.7493, Acc.stove: 0.8811, Acc.palm: 0.8011, Acc.kitchen island: 0.6945, Acc.computer: 0.8911, Acc.swivel chair: 0.5297, Acc.boat: 0.5928, Acc.bar: 0.6427, Acc.arcade machine: 0.7795, Acc.hovel: 0.4834, Acc.bus: 0.9674, Acc.towel: 0.8508, Acc.light: 0.5883, Acc.truck: 0.5494, Acc.tower: 0.6933, Acc.chandelier: 0.8635, Acc.awning: 0.4653, Acc.streetlight: 0.3933, Acc.booth: 0.7789, Acc.television receiver: 0.8215, Acc.airplane: 0.6743, Acc.dirt track: 0.4084, Acc.apparel: 0.7586, Acc.pole: 0.3348, Acc.land: 0.0534, Acc.bannister: 0.1567, Acc.escalator: 0.6718, Acc.ottoman: 0.6350, Acc.bottle: 0.6243, Acc.buffet: 0.5027, Acc.poster: 0.4439, Acc.stage: 0.2259, Acc.van: 0.5365, Acc.ship: 0.4939, Acc.fountain: 0.2688, Acc.conveyer belt: 0.9399, Acc.canopy: 0.5095, Acc.washer: 0.7447, Acc.plaything: 0.3793, Acc.swimming pool: 0.8074, Acc.stool: 0.5192, Acc.barrel: 0.7400, Acc.basket: 0.6306, Acc.waterfall: 0.5571, Acc.tent: 0.9765, Acc.bag: 0.2228, Acc.minibike: 0.8755, Acc.cradle: 0.9673, Acc.oven: 0.5673, Acc.ball: 0.5820, Acc.food: 0.8287, Acc.step: 0.2250, Acc.tank: 0.6377, Acc.trade name: 0.3303, Acc.microwave: 0.9186, Acc.pot: 0.5388, Acc.animal: 0.6735, Acc.bicycle: 0.8022, Acc.lake: 0.6198, Acc.dishwasher: 0.8227, Acc.screen: 0.6681, Acc.blanket: 0.2601, Acc.sculpture: 0.8619, Acc.hood: 0.7249, Acc.sconce: 0.6081, Acc.vase: 0.5720, Acc.traffic light: 0.5341, Acc.tray: 0.1719, Acc.ashcan: 0.4993, Acc.fan: 0.7933, Acc.pier: 0.4286, Acc.crt screen: 0.1012, Acc.plate: 0.7503, Acc.monitor: 0.2331, Acc.bulletin board: 0.5615, Acc.shower: 0.1114, Acc.radiator: 0.7979, Acc.glass: 0.1453, Acc.clock: 0.4892, Acc.flag: 0.8073
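The Summary block above (aAcc 83.84 / mIoU 52.38 / mAcc 64.96) reports mIoU as the unweighted mean over the 150 ADE20K per-class IoU values listed in the Iter(val) record. A minimal cross-check sketch in Python, assuming this log was saved to a file (the path below is hypothetical):

import re

# Hypothetical path; point this at wherever the training log above was written.
LOG_PATH = "work_dirs/diffseg_swin_l_2x8_512x512_160k_ade20k_v20/20230219_040109.log"

with open(LOG_PATH) as f:
    # Take the most recent validation record (the long "Iter(val)" line above).
    val_record = [line for line in f if "Iter(val)" in line][-1]

# Pull every "IoU.<class>: <value>" pair; the Acc.* fields and the aggregate
# mIoU/mAcc/aAcc entries do not match because they lack the "IoU." prefix.
ious = [float(v) for v in re.findall(r"IoU\.[^:]+: ([0-9.]+)", val_record)]

print(len(ious))                        # 150 ADE20K classes
print(round(sum(ious) / len(ious), 4))  # ~0.5238, matching the reported mIoU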
2023-02-19 10:25:16,726 - mmseg - INFO - Iter [80050/160000]	lr: 2.998e-05, eta: 6:20:57, time: 0.638, data_time: 0.363, memory: 15214, decode.loss_ce: 0.1376, decode.acc_seg: 94.1912, aux.loss_ce: 0.0809, aux.acc_seg: 91.5547, loss: 0.2185, grad_norm: 2.0549
2023-02-19 10:25:30,871 - mmseg - INFO - Iter [80100/160000]	lr: 2.996e-05, eta: 6:20:43, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1423, decode.acc_seg: 94.1242, aux.loss_ce: 0.0800, aux.acc_seg: 91.7861, loss: 0.2223, grad_norm: 2.3455
2023-02-19 10:25:44,581 - mmseg - INFO - Iter [80150/160000]	lr: 2.994e-05, eta: 6:20:28, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.2514, aux.loss_ce: 0.0781, aux.acc_seg: 91.7449, loss: 0.2138, grad_norm: 1.8982
2023-02-19 10:25:58,270 - mmseg - INFO - Iter [80200/160000]	lr: 2.993e-05, eta: 6:20:13, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1388, decode.acc_seg: 94.2140, aux.loss_ce: 0.0809, aux.acc_seg: 91.6418, loss: 0.2196, grad_norm: 1.7212
2023-02-19 10:26:12,239 - mmseg - INFO - Iter [80250/160000]	lr: 2.991e-05, eta: 6:19:58, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1389, decode.acc_seg: 94.2046, aux.loss_ce: 0.0811, aux.acc_seg: 91.6825, loss: 0.2200, grad_norm: 2.0851
2023-02-19 10:26:26,013 - mmseg - INFO - Iter [80300/160000]	lr: 2.989e-05, eta: 6:19:44, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1378, decode.acc_seg: 94.1928, aux.loss_ce: 0.0803, aux.acc_seg: 91.6942, loss: 0.2181, grad_norm: 2.4764
2023-02-19 10:26:39,549 - mmseg - INFO - Iter [80350/160000]	lr: 2.987e-05, eta: 6:19:29, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1419, decode.acc_seg: 94.0649, aux.loss_ce: 0.0788, aux.acc_seg: 91.6997, loss: 0.2207, grad_norm: 2.2935
2023-02-19 10:26:53,248 - mmseg - INFO - Iter [80400/160000]	lr: 2.985e-05, eta: 6:19:14, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1333, decode.acc_seg: 94.4815, aux.loss_ce: 0.0765, aux.acc_seg: 92.0625, loss: 0.2098, grad_norm: 1.7844
2023-02-19 10:27:06,880 - mmseg - INFO - Iter [80450/160000]	lr: 2.983e-05, eta: 6:18:59, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1417, decode.acc_seg: 94.1071, aux.loss_ce: 0.0833, aux.acc_seg: 91.3848, loss: 0.2249, grad_norm: 2.1786
2023-02-19 10:27:21,609 - mmseg - INFO - Iter [80500/160000]	lr: 2.981e-05, eta: 6:18:45, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1332, decode.acc_seg: 94.3221, aux.loss_ce: 0.0767, aux.acc_seg: 91.9476, loss: 0.2099, grad_norm: 1.7145
2023-02-19 10:27:35,203 - mmseg - INFO - Iter [80550/160000]	lr: 2.979e-05, eta: 6:18:30, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1371, decode.acc_seg: 94.1976, aux.loss_ce: 0.0800, aux.acc_seg: 91.7048, loss: 0.2171, grad_norm: 2.3361
2023-02-19 10:27:49,077 - mmseg - INFO - Iter [80600/160000]	lr: 2.978e-05, eta: 6:18:15, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1362, decode.acc_seg: 94.3049, aux.loss_ce: 0.0777, aux.acc_seg: 91.9973, loss: 0.2138, grad_norm: 2.4059
2023-02-19 10:28:02,627 - mmseg - INFO - Iter [80650/160000]	lr: 2.976e-05, eta: 6:18:00, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1420, decode.acc_seg: 93.9932, aux.loss_ce: 0.0851, aux.acc_seg: 91.2237, loss: 0.2271, grad_norm: 2.4105
2023-02-19 10:28:17,020 - mmseg - INFO - Iter [80700/160000]	lr: 2.974e-05, eta: 6:17:46, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1457, decode.acc_seg: 93.9000, aux.loss_ce: 0.0819, aux.acc_seg: 91.5218, loss: 0.2275, grad_norm: 2.6529
2023-02-19 10:28:31,119 - mmseg - INFO - Iter [80750/160000]	lr: 2.972e-05, eta: 6:17:32, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1368, decode.acc_seg: 94.0567, aux.loss_ce: 0.0813, aux.acc_seg: 91.4867, loss: 0.2182, grad_norm: 1.7933
2023-02-19 10:28:44,999 - mmseg - INFO - Iter [80800/160000]	lr: 2.970e-05, eta: 6:17:17, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1427, decode.acc_seg: 94.0292, aux.loss_ce: 0.0789, aux.acc_seg: 91.7364, loss: 0.2215, grad_norm: 1.8240
2023-02-19 10:29:00,997 - mmseg - INFO - Iter [80850/160000]	lr: 2.968e-05, eta: 6:17:04, time: 0.320, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1398, decode.acc_seg: 93.9369, aux.loss_ce: 0.0786, aux.acc_seg: 91.7181, loss: 0.2184, grad_norm: 1.9293
2023-02-19 10:29:14,665 - mmseg - INFO - Iter [80900/160000]	lr: 2.966e-05, eta: 6:16:49, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1330, decode.acc_seg: 94.4429, aux.loss_ce: 0.0765, aux.acc_seg: 92.0835, loss: 0.2095, grad_norm: 1.4349
2023-02-19 10:29:28,438 - mmseg - INFO - Iter [80950/160000]	lr: 2.964e-05, eta: 6:16:35, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1292, decode.acc_seg: 94.5364, aux.loss_ce: 0.0749, aux.acc_seg: 92.0224, loss: 0.2041, grad_norm: 2.6281
2023-02-19 10:29:42,269 - mmseg - INFO - Saving checkpoint at 81000 iterations
2023-02-19 10:29:45,503 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:29:45,503 - mmseg - INFO - Iter [81000/160000]	lr: 2.963e-05, eta: 6:16:23, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1269, decode.acc_seg: 94.4591, aux.loss_ce: 0.0745, aux.acc_seg: 92.1523, loss: 0.2014, grad_norm: 1.7733
2023-02-19 10:29:59,175 - mmseg - INFO - Iter [81050/160000]	lr: 2.961e-05, eta: 6:16:08, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1354, decode.acc_seg: 94.2656, aux.loss_ce: 0.0767, aux.acc_seg: 91.9456, loss: 0.2121, grad_norm: 1.9259
2023-02-19 10:30:13,865 - mmseg - INFO - Iter [81100/160000]	lr: 2.959e-05, eta: 6:15:54, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1420, decode.acc_seg: 94.0877, aux.loss_ce: 0.0817, aux.acc_seg: 91.5685, loss: 0.2237, grad_norm: 1.9542
2023-02-19 10:30:28,045 - mmseg - INFO - Iter [81150/160000]	lr: 2.957e-05, eta: 6:15:40, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1315, decode.acc_seg: 94.3865, aux.loss_ce: 0.0754, aux.acc_seg: 92.0649, loss: 0.2070, grad_norm: 1.8189
2023-02-19 10:30:41,966 - mmseg - INFO - Iter [81200/160000]	lr: 2.955e-05, eta: 6:15:25, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1365, decode.acc_seg: 94.3433, aux.loss_ce: 0.0763, aux.acc_seg: 92.1506, loss: 0.2128, grad_norm: 2.0162
2023-02-19 10:30:56,722 - mmseg - INFO - Iter [81250/160000]	lr: 2.953e-05, eta: 6:15:11, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1227, decode.acc_seg: 94.7402, aux.loss_ce: 0.0720, aux.acc_seg: 92.5382, loss: 0.1947, grad_norm: 1.3943
2023-02-19 10:31:10,739 - mmseg - INFO - Iter [81300/160000]	lr: 2.951e-05, eta: 6:14:57, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1344, decode.acc_seg: 94.2844, aux.loss_ce: 0.0774, aux.acc_seg: 91.8124, loss: 0.2119, grad_norm: 1.5757
2023-02-19 10:31:25,567 - mmseg - INFO - Iter [81350/160000]	lr: 2.949e-05, eta: 6:14:43, time: 0.297, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1374, decode.acc_seg: 94.2537, aux.loss_ce: 0.0781, aux.acc_seg: 91.8871, loss: 0.2155, grad_norm: 1.8919
2023-02-19 10:31:40,004 - mmseg - INFO - Iter [81400/160000]	lr: 2.948e-05, eta: 6:14:29, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1270, decode.acc_seg: 94.5495, aux.loss_ce: 0.0730, aux.acc_seg: 92.2551, loss: 0.2000, grad_norm: 1.5977
2023-02-19 10:31:54,221 - mmseg - INFO - Iter [81450/160000]	lr: 2.946e-05, eta: 6:14:14, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1429, decode.acc_seg: 94.0932, aux.loss_ce: 0.0838, aux.acc_seg: 91.5292, loss: 0.2266, grad_norm: 1.9989
2023-02-19 10:32:07,825 - mmseg - INFO - Iter [81500/160000]	lr: 2.944e-05, eta: 6:13:59, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1276, decode.acc_seg: 94.6869, aux.loss_ce: 0.0760, aux.acc_seg: 92.0577, loss: 0.2036, grad_norm: 1.5958
2023-02-19 10:32:21,921 - mmseg - INFO - Iter [81550/160000]	lr: 2.942e-05, eta: 6:13:45, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1400, decode.acc_seg: 94.1796, aux.loss_ce: 0.0801, aux.acc_seg: 91.7547, loss: 0.2201, grad_norm: 2.2510
2023-02-19 10:32:35,947 - mmseg - INFO - Iter [81600/160000]	lr: 2.940e-05, eta: 6:13:30, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1398, decode.acc_seg: 94.0675, aux.loss_ce: 0.0802, aux.acc_seg: 91.7583, loss: 0.2200, grad_norm: 1.8002
2023-02-19 10:32:49,747 - mmseg - INFO - Iter [81650/160000]	lr: 2.938e-05, eta: 6:13:16, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1435, decode.acc_seg: 94.1372, aux.loss_ce: 0.0826, aux.acc_seg: 91.6201, loss: 0.2261, grad_norm: 1.9550
2023-02-19 10:33:03,720 - mmseg - INFO - Iter [81700/160000]	lr: 2.936e-05, eta: 6:13:01, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1382, decode.acc_seg: 94.0834, aux.loss_ce: 0.0766, aux.acc_seg: 91.8974, loss: 0.2148, grad_norm: 2.1062
2023-02-19 10:33:17,415 - mmseg - INFO - Iter [81750/160000]	lr: 2.934e-05, eta: 6:12:46, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1301, decode.acc_seg: 94.4387, aux.loss_ce: 0.0758, aux.acc_seg: 91.8901, loss: 0.2059, grad_norm: 1.4769
2023-02-19 10:33:31,337 - mmseg - INFO - Iter [81800/160000]	lr: 2.933e-05, eta: 6:12:32, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1338, decode.acc_seg: 94.1850, aux.loss_ce: 0.0766, aux.acc_seg: 91.7709, loss: 0.2104, grad_norm: 1.8285
2023-02-19 10:33:45,034 - mmseg - INFO - Iter [81850/160000]	lr: 2.931e-05, eta: 6:12:17, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1423, decode.acc_seg: 94.1013, aux.loss_ce: 0.0813, aux.acc_seg: 91.6147, loss: 0.2236, grad_norm: 1.9103
2023-02-19 10:33:59,095 - mmseg - INFO - Iter [81900/160000]	lr: 2.929e-05, eta: 6:12:02, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1297, decode.acc_seg: 94.5088, aux.loss_ce: 0.0790, aux.acc_seg: 91.8269, loss: 0.2087, grad_norm: 1.9313
2023-02-19 10:34:14,114 - mmseg - INFO - Iter [81950/160000]	lr: 2.927e-05, eta: 6:11:49, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1339, decode.acc_seg: 94.3068, aux.loss_ce: 0.0750, aux.acc_seg: 92.0191, loss: 0.2088, grad_norm: 1.8155
2023-02-19 10:34:27,844 - mmseg - INFO - Saving checkpoint at 82000 iterations
2023-02-19 10:34:31,088 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:34:31,088 - mmseg - INFO - Iter [82000/160000]	lr: 2.925e-05, eta: 6:11:37, time: 0.340, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1330, decode.acc_seg: 94.3821, aux.loss_ce: 0.0788, aux.acc_seg: 91.8692, loss: 0.2118, grad_norm: 2.0907
2023-02-19 10:34:45,456 - mmseg - INFO - Iter [82050/160000]	lr: 2.923e-05, eta: 6:11:23, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1411, decode.acc_seg: 94.2117, aux.loss_ce: 0.0832, aux.acc_seg: 91.5021, loss: 0.2243, grad_norm: 1.9295
2023-02-19 10:35:01,464 - mmseg - INFO - Iter [82100/160000]	lr: 2.921e-05, eta: 6:11:10, time: 0.321, data_time: 0.049, memory: 15214, decode.loss_ce: 0.1444, decode.acc_seg: 93.8669, aux.loss_ce: 0.0805, aux.acc_seg: 91.4213, loss: 0.2249, grad_norm: 1.8406
2023-02-19 10:35:15,167 - mmseg - INFO - Iter [82150/160000]	lr: 2.919e-05, eta: 6:10:55, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1362, decode.acc_seg: 94.1542, aux.loss_ce: 0.0777, aux.acc_seg: 91.8331, loss: 0.2139, grad_norm: 1.8304
2023-02-19 10:35:28,857 - mmseg - INFO - Iter [82200/160000]	lr: 2.918e-05, eta: 6:10:40, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1286, decode.acc_seg: 94.5449, aux.loss_ce: 0.0770, aux.acc_seg: 91.9489, loss: 0.2056, grad_norm: 2.0386
2023-02-19 10:35:42,487 - mmseg - INFO - Iter [82250/160000]	lr: 2.916e-05, eta: 6:10:25, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1440, decode.acc_seg: 94.0882, aux.loss_ce: 0.0784, aux.acc_seg: 91.9370, loss: 0.2225, grad_norm: 3.1255
2023-02-19 10:35:57,142 - mmseg - INFO - Iter [82300/160000]	lr: 2.914e-05, eta: 6:10:11, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1285, decode.acc_seg: 94.6210, aux.loss_ce: 0.0760, aux.acc_seg: 92.2399, loss: 0.2045, grad_norm: 1.7285
2023-02-19 10:36:11,489 - mmseg - INFO - Iter [82350/160000]	lr: 2.912e-05, eta: 6:09:57, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1383, decode.acc_seg: 94.1352, aux.loss_ce: 0.0780, aux.acc_seg: 91.8150, loss: 0.2163, grad_norm: 2.2918
2023-02-19 10:36:25,398 - mmseg - INFO - Iter [82400/160000]	lr: 2.910e-05, eta: 6:09:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1338, decode.acc_seg: 94.3917, aux.loss_ce: 0.0763, aux.acc_seg: 92.1054, loss: 0.2102, grad_norm: 2.0018
2023-02-19 10:36:39,609 - mmseg - INFO - Iter [82450/160000]	lr: 2.908e-05, eta: 6:09:28, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1367, decode.acc_seg: 94.2529, aux.loss_ce: 0.0790, aux.acc_seg: 91.8064, loss: 0.2157, grad_norm: 1.8099
2023-02-19 10:36:53,449 - mmseg - INFO - Iter [82500/160000]	lr: 2.906e-05, eta: 6:09:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1307, decode.acc_seg: 94.4268, aux.loss_ce: 0.0779, aux.acc_seg: 91.8711, loss: 0.2086, grad_norm: 1.9683
2023-02-19 10:37:07,359 - mmseg - INFO - Iter [82550/160000]	lr: 2.904e-05, eta: 6:08:59, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1280, decode.acc_seg: 94.6210, aux.loss_ce: 0.0758, aux.acc_seg: 92.1360, loss: 0.2037, grad_norm: 1.6754
2023-02-19 10:37:22,439 - mmseg - INFO - Iter [82600/160000]	lr: 2.903e-05, eta: 6:08:45, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1311, decode.acc_seg: 94.4136, aux.loss_ce: 0.0789, aux.acc_seg: 91.7110, loss: 0.2100, grad_norm: 1.6890
2023-02-19 10:37:37,239 - mmseg - INFO - Iter [82650/160000]	lr: 2.901e-05, eta: 6:08:31, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1396, decode.acc_seg: 94.1213, aux.loss_ce: 0.0791, aux.acc_seg: 91.8716, loss: 0.2186, grad_norm: 1.6483
2023-02-19 10:37:51,499 - mmseg - INFO - Iter [82700/160000]	lr: 2.899e-05, eta: 6:08:17, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1330, decode.acc_seg: 94.3129, aux.loss_ce: 0.0750, aux.acc_seg: 91.9969, loss: 0.2080, grad_norm: 1.6346
2023-02-19 10:38:05,775 - mmseg - INFO - Iter [82750/160000]	lr: 2.897e-05, eta: 6:08:03, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1302, decode.acc_seg: 94.3596, aux.loss_ce: 0.0775, aux.acc_seg: 91.8890, loss: 0.2077, grad_norm: 1.6881
2023-02-19 10:38:20,051 - mmseg - INFO - Iter [82800/160000]	lr: 2.895e-05, eta: 6:07:48, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1345, decode.acc_seg: 94.3867, aux.loss_ce: 0.0800, aux.acc_seg: 91.7130, loss: 0.2145, grad_norm: 2.0175
2023-02-19 10:38:33,860 - mmseg - INFO - Iter [82850/160000]	lr: 2.893e-05, eta: 6:07:34, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1384, decode.acc_seg: 94.2083, aux.loss_ce: 0.0799, aux.acc_seg: 91.7467, loss: 0.2183, grad_norm: 1.9938
2023-02-19 10:38:47,757 - mmseg - INFO - Iter [82900/160000]	lr: 2.891e-05, eta: 6:07:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1362, decode.acc_seg: 94.1576, aux.loss_ce: 0.0801, aux.acc_seg: 91.7456, loss: 0.2162, grad_norm: 2.0355
2023-02-19 10:39:01,711 - mmseg - INFO - Iter [82950/160000]	lr: 2.889e-05, eta: 6:07:04, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1262, decode.acc_seg: 94.7333, aux.loss_ce: 0.0753, aux.acc_seg: 92.1819, loss: 0.2015, grad_norm: 1.6618
2023-02-19 10:39:15,347 - mmseg - INFO - Saving checkpoint at 83000 iterations
2023-02-19 10:39:18,577 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:39:18,577 - mmseg - INFO - Iter [83000/160000]	lr: 2.888e-05, eta: 6:06:52, time: 0.337, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.1936, aux.loss_ce: 0.0810, aux.acc_seg: 91.6455, loss: 0.2166, grad_norm: 1.6719
2023-02-19 10:39:32,165 - mmseg - INFO - Iter [83050/160000]	lr: 2.886e-05, eta: 6:06:38, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1358, decode.acc_seg: 94.2585, aux.loss_ce: 0.0792, aux.acc_seg: 91.6699, loss: 0.2151, grad_norm: 1.8542
2023-02-19 10:39:46,314 - mmseg - INFO - Iter [83100/160000]	lr: 2.884e-05, eta: 6:06:23, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1345, decode.acc_seg: 94.1802, aux.loss_ce: 0.0778, aux.acc_seg: 91.8937, loss: 0.2122, grad_norm: 1.9583
2023-02-19 10:40:00,365 - mmseg - INFO - Iter [83150/160000]	lr: 2.882e-05, eta: 6:06:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1390, decode.acc_seg: 93.8914, aux.loss_ce: 0.0786, aux.acc_seg: 91.5517, loss: 0.2176, grad_norm: 1.8926
2023-02-19 10:40:14,571 - mmseg - INFO - Iter [83200/160000]	lr: 2.880e-05, eta: 6:05:54, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1422, decode.acc_seg: 94.0581, aux.loss_ce: 0.0801, aux.acc_seg: 91.7423, loss: 0.2222, grad_norm: 1.9173
2023-02-19 10:40:28,160 - mmseg - INFO - Iter [83250/160000]	lr: 2.878e-05, eta: 6:05:39, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1256, decode.acc_seg: 94.7608, aux.loss_ce: 0.0731, aux.acc_seg: 92.3969, loss: 0.1987, grad_norm: 1.3267
2023-02-19 10:40:41,763 - mmseg - INFO - Iter [83300/160000]	lr: 2.876e-05, eta: 6:05:24, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1309, decode.acc_seg: 94.4481, aux.loss_ce: 0.0757, aux.acc_seg: 92.0295, loss: 0.2066, grad_norm: 2.0819
2023-02-19 10:40:55,513 - mmseg - INFO - Iter [83350/160000]	lr: 2.874e-05, eta: 6:05:10, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1298, decode.acc_seg: 94.5144, aux.loss_ce: 0.0757, aux.acc_seg: 92.1655, loss: 0.2055, grad_norm: 2.6846
2023-02-19 10:41:11,992 - mmseg - INFO - Iter [83400/160000]	lr: 2.873e-05, eta: 6:04:57, time: 0.330, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1261, decode.acc_seg: 94.6286, aux.loss_ce: 0.0720, aux.acc_seg: 92.4307, loss: 0.1980, grad_norm: 1.7523
2023-02-19 10:41:25,659 - mmseg - INFO - Iter [83450/160000]	lr: 2.871e-05, eta: 6:04:42, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1318, decode.acc_seg: 94.4213, aux.loss_ce: 0.0755, aux.acc_seg: 92.1807, loss: 0.2074, grad_norm: 1.5748
2023-02-19 10:41:40,123 - mmseg - INFO - Iter [83500/160000]	lr: 2.869e-05, eta: 6:04:28, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1285, decode.acc_seg: 94.6295, aux.loss_ce: 0.0747, aux.acc_seg: 92.2558, loss: 0.2032, grad_norm: 1.7214
2023-02-19 10:41:54,318 - mmseg - INFO - Iter [83550/160000]	lr: 2.867e-05, eta: 6:04:14, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1354, decode.acc_seg: 94.1893, aux.loss_ce: 0.0802, aux.acc_seg: 91.6618, loss: 0.2156, grad_norm: 1.9333
2023-02-19 10:42:08,061 - mmseg - INFO - Iter [83600/160000]	lr: 2.865e-05, eta: 6:03:59, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1355, decode.acc_seg: 94.1789, aux.loss_ce: 0.0796, aux.acc_seg: 91.4512, loss: 0.2151, grad_norm: 2.1873
2023-02-19 10:42:21,901 - mmseg - INFO - Iter [83650/160000]	lr: 2.863e-05, eta: 6:03:44, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1229, decode.acc_seg: 94.7061, aux.loss_ce: 0.0732, aux.acc_seg: 92.2468, loss: 0.1961, grad_norm: 1.7714
2023-02-19 10:42:35,821 - mmseg - INFO - Iter [83700/160000]	lr: 2.861e-05, eta: 6:03:30, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1316, decode.acc_seg: 94.4119, aux.loss_ce: 0.0776, aux.acc_seg: 91.9564, loss: 0.2092, grad_norm: 2.1808
2023-02-19 10:42:49,938 - mmseg - INFO - Iter [83750/160000]	lr: 2.859e-05, eta: 6:03:15, time: 0.284, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1319, decode.acc_seg: 94.4857, aux.loss_ce: 0.0775, aux.acc_seg: 91.9135, loss: 0.2095, grad_norm: 2.2125
2023-02-19 10:43:03,717 - mmseg - INFO - Iter [83800/160000]	lr: 2.858e-05, eta: 6:03:01, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1304, decode.acc_seg: 94.4629, aux.loss_ce: 0.0739, aux.acc_seg: 92.1475, loss: 0.2042, grad_norm: 1.3204
2023-02-19 10:43:18,292 - mmseg - INFO - Iter [83850/160000]	lr: 2.856e-05, eta: 6:02:46, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1289, decode.acc_seg: 94.4526, aux.loss_ce: 0.0738, aux.acc_seg: 92.1524, loss: 0.2026, grad_norm: 2.0322
2023-02-19 10:43:32,008 - mmseg - INFO - Iter [83900/160000]	lr: 2.854e-05, eta: 6:02:32, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1366, decode.acc_seg: 94.1743, aux.loss_ce: 0.0800, aux.acc_seg: 91.7641, loss: 0.2166, grad_norm: 2.0138
2023-02-19 10:43:46,173 - mmseg - INFO - Iter [83950/160000]	lr: 2.852e-05, eta: 6:02:17, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1276, decode.acc_seg: 94.5463, aux.loss_ce: 0.0760, aux.acc_seg: 92.0074, loss: 0.2035, grad_norm: 2.0116
2023-02-19 10:44:00,077 - mmseg - INFO - Saving checkpoint at 84000 iterations
2023-02-19 10:44:03,374 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:44:03,375 - mmseg - INFO - Iter [84000/160000]	lr: 2.850e-05, eta: 6:02:06, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1387, decode.acc_seg: 94.1021, aux.loss_ce: 0.0803, aux.acc_seg: 91.6208, loss: 0.2191, grad_norm: 2.0667
2023-02-19 10:44:17,903 - mmseg - INFO - Iter [84050/160000]	lr: 2.848e-05, eta: 6:01:52, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1253, decode.acc_seg: 94.5696, aux.loss_ce: 0.0771, aux.acc_seg: 91.9379, loss: 0.2025, grad_norm: 1.7066
2023-02-19 10:44:32,035 - mmseg - INFO - Iter [84100/160000]	lr: 2.846e-05, eta: 6:01:37, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1323, decode.acc_seg: 94.4571, aux.loss_ce: 0.0754, aux.acc_seg: 92.2004, loss: 0.2077, grad_norm: 1.7771
2023-02-19 10:44:46,260 - mmseg - INFO - Iter [84150/160000]	lr: 2.844e-05, eta: 6:01:23, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1291, decode.acc_seg: 94.4359, aux.loss_ce: 0.0726, aux.acc_seg: 92.2232, loss: 0.2017, grad_norm: 1.6384
2023-02-19 10:45:00,035 - mmseg - INFO - Iter [84200/160000]	lr: 2.843e-05, eta: 6:01:08, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1280, decode.acc_seg: 94.5425, aux.loss_ce: 0.0744, aux.acc_seg: 92.1912, loss: 0.2024, grad_norm: 1.8694
2023-02-19 10:45:15,146 - mmseg - INFO - Iter [84250/160000]	lr: 2.841e-05, eta: 6:00:54, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1409, decode.acc_seg: 93.9818, aux.loss_ce: 0.0827, aux.acc_seg: 91.5521, loss: 0.2235, grad_norm: 3.3413
2023-02-19 10:45:28,725 - mmseg - INFO - Iter [84300/160000]	lr: 2.839e-05, eta: 6:00:39, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1258, decode.acc_seg: 94.7064, aux.loss_ce: 0.0739, aux.acc_seg: 92.2946, loss: 0.1998, grad_norm: 1.5593
2023-02-19 10:45:42,781 - mmseg - INFO - Iter [84350/160000]	lr: 2.837e-05, eta: 6:00:25, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1322, decode.acc_seg: 94.3657, aux.loss_ce: 0.0811, aux.acc_seg: 91.5413, loss: 0.2132, grad_norm: 1.7817
2023-02-19 10:45:57,072 - mmseg - INFO - Iter [84400/160000]	lr: 2.835e-05, eta: 6:00:11, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1338, decode.acc_seg: 94.3073, aux.loss_ce: 0.0792, aux.acc_seg: 91.6787, loss: 0.2130, grad_norm: 2.0655
2023-02-19 10:46:10,687 - mmseg - INFO - Iter [84450/160000]	lr: 2.833e-05, eta: 5:59:56, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1281, decode.acc_seg: 94.4569, aux.loss_ce: 0.0733, aux.acc_seg: 92.1343, loss: 0.2013, grad_norm: 1.5668
2023-02-19 10:46:24,771 - mmseg - INFO - Iter [84500/160000]	lr: 2.831e-05, eta: 5:59:41, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1351, decode.acc_seg: 94.4010, aux.loss_ce: 0.0785, aux.acc_seg: 92.0174, loss: 0.2136, grad_norm: 1.7742
2023-02-19 10:46:38,932 - mmseg - INFO - Iter [84550/160000]	lr: 2.829e-05, eta: 5:59:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1363, decode.acc_seg: 94.1431, aux.loss_ce: 0.0783, aux.acc_seg: 91.7015, loss: 0.2145, grad_norm: 1.8932
2023-02-19 10:46:52,771 - mmseg - INFO - Iter [84600/160000]	lr: 2.828e-05, eta: 5:59:12, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1332, decode.acc_seg: 94.4918, aux.loss_ce: 0.0762, aux.acc_seg: 92.1420, loss: 0.2094, grad_norm: 1.5973
2023-02-19 10:47:09,401 - mmseg - INFO - Iter [84650/160000]	lr: 2.826e-05, eta: 5:59:00, time: 0.333, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1402, decode.acc_seg: 93.9928, aux.loss_ce: 0.0811, aux.acc_seg: 91.5845, loss: 0.2213, grad_norm: 2.0116
2023-02-19 10:47:23,076 - mmseg - INFO - Iter [84700/160000]	lr: 2.824e-05, eta: 5:58:45, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1310, decode.acc_seg: 94.5146, aux.loss_ce: 0.0739, aux.acc_seg: 92.3314, loss: 0.2049, grad_norm: 1.7943
2023-02-19 10:47:37,847 - mmseg - INFO - Iter [84750/160000]	lr: 2.822e-05, eta: 5:58:31, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1327, decode.acc_seg: 94.5127, aux.loss_ce: 0.0776, aux.acc_seg: 92.0680, loss: 0.2103, grad_norm: 2.2291
2023-02-19 10:47:51,501 - mmseg - INFO - Iter [84800/160000]	lr: 2.820e-05, eta: 5:58:16, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1290, decode.acc_seg: 94.5082, aux.loss_ce: 0.0737, aux.acc_seg: 92.2642, loss: 0.2027, grad_norm: 1.8355
2023-02-19 10:48:05,124 - mmseg - INFO - Iter [84850/160000]	lr: 2.818e-05, eta: 5:58:02, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1265, decode.acc_seg: 94.5669, aux.loss_ce: 0.0743, aux.acc_seg: 92.1762, loss: 0.2009, grad_norm: 1.7171
2023-02-19 10:48:18,726 - mmseg - INFO - Iter [84900/160000]	lr: 2.816e-05, eta: 5:57:47, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1241, decode.acc_seg: 94.7051, aux.loss_ce: 0.0736, aux.acc_seg: 92.3407, loss: 0.1977, grad_norm: 1.7341
2023-02-19 10:48:33,137 - mmseg - INFO - Iter [84950/160000]	lr: 2.814e-05, eta: 5:57:32, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1257, decode.acc_seg: 94.6499, aux.loss_ce: 0.0741, aux.acc_seg: 92.1195, loss: 0.1998, grad_norm: 1.6069
2023-02-19 10:48:47,231 - mmseg - INFO - Saving checkpoint at 85000 iterations
2023-02-19 10:48:50,441 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:48:50,441 - mmseg - INFO - Iter [85000/160000]	lr: 2.813e-05, eta: 5:57:21, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1370, decode.acc_seg: 94.1626, aux.loss_ce: 0.0802, aux.acc_seg: 91.6481, loss: 0.2172, grad_norm: 2.5907
2023-02-19 10:49:04,994 - mmseg - INFO - Iter [85050/160000]	lr: 2.811e-05, eta: 5:57:07, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1362, decode.acc_seg: 94.2523, aux.loss_ce: 0.0782, aux.acc_seg: 91.7597, loss: 0.2144, grad_norm: 1.8795
2023-02-19 10:49:18,680 - mmseg - INFO - Iter [85100/160000]	lr: 2.809e-05, eta: 5:56:52, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1297, decode.acc_seg: 94.5680, aux.loss_ce: 0.0741, aux.acc_seg: 92.4147, loss: 0.2038, grad_norm: 1.9693
2023-02-19 10:49:32,761 - mmseg - INFO - Iter [85150/160000]	lr: 2.807e-05, eta: 5:56:37, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1418, decode.acc_seg: 94.1336, aux.loss_ce: 0.0779, aux.acc_seg: 91.7894, loss: 0.2197, grad_norm: 1.9766
2023-02-19 10:49:46,999 - mmseg - INFO - Iter [85200/160000]	lr: 2.805e-05, eta: 5:56:23, time: 0.285, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1319, decode.acc_seg: 94.2661, aux.loss_ce: 0.0790, aux.acc_seg: 91.6346, loss: 0.2109, grad_norm: 1.7813
2023-02-19 10:50:01,059 - mmseg - INFO - Iter [85250/160000]	lr: 2.803e-05, eta: 5:56:09, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1392, decode.acc_seg: 94.2281, aux.loss_ce: 0.0787, aux.acc_seg: 91.9169, loss: 0.2179, grad_norm: 2.2609
2023-02-19 10:50:16,310 - mmseg - INFO - Iter [85300/160000]	lr: 2.801e-05, eta: 5:55:55, time: 0.305, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1388, decode.acc_seg: 94.1712, aux.loss_ce: 0.0790, aux.acc_seg: 91.7028, loss: 0.2179, grad_norm: 2.0209
2023-02-19 10:50:30,230 - mmseg - INFO - Iter [85350/160000]	lr: 2.799e-05, eta: 5:55:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1384, decode.acc_seg: 94.0969, aux.loss_ce: 0.0800, aux.acc_seg: 91.7250, loss: 0.2185, grad_norm: 1.7972
2023-02-19 10:50:44,241 - mmseg - INFO - Iter [85400/160000]	lr: 2.798e-05, eta: 5:55:26, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1373, decode.acc_seg: 94.2637, aux.loss_ce: 0.0789, aux.acc_seg: 91.8259, loss: 0.2163, grad_norm: 2.3262
2023-02-19 10:50:57,965 - mmseg - INFO - Iter [85450/160000]	lr: 2.796e-05, eta: 5:55:11, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1376, decode.acc_seg: 94.2547, aux.loss_ce: 0.0812, aux.acc_seg: 91.5718, loss: 0.2188, grad_norm: 2.0345
2023-02-19 10:51:12,003 - mmseg - INFO - Iter [85500/160000]	lr: 2.794e-05, eta: 5:54:57, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1320, decode.acc_seg: 94.3732, aux.loss_ce: 0.0759, aux.acc_seg: 92.0584, loss: 0.2079, grad_norm: 1.8205
2023-02-19 10:51:25,712 - mmseg - INFO - Iter [85550/160000]	lr: 2.792e-05, eta: 5:54:42, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1242, decode.acc_seg: 94.6230, aux.loss_ce: 0.0720, aux.acc_seg: 92.2323, loss: 0.1962, grad_norm: 1.7471
2023-02-19 10:51:40,213 - mmseg - INFO - Iter [85600/160000]	lr: 2.790e-05, eta: 5:54:28, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1376, decode.acc_seg: 94.1993, aux.loss_ce: 0.0794, aux.acc_seg: 91.7589, loss: 0.2170, grad_norm: 1.8523
2023-02-19 10:51:54,054 - mmseg - INFO - Iter [85650/160000]	lr: 2.788e-05, eta: 5:54:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1384, decode.acc_seg: 94.1066, aux.loss_ce: 0.0778, aux.acc_seg: 91.8317, loss: 0.2162, grad_norm: 2.1203
2023-02-19 10:52:07,963 - mmseg - INFO - Iter [85700/160000]	lr: 2.786e-05, eta: 5:53:58, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1231, decode.acc_seg: 94.7740, aux.loss_ce: 0.0708, aux.acc_seg: 92.5130, loss: 0.1939, grad_norm: 1.6353
2023-02-19 10:52:22,235 - mmseg - INFO - Iter [85750/160000]	lr: 2.784e-05, eta: 5:53:44, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1328, decode.acc_seg: 94.1850, aux.loss_ce: 0.0749, aux.acc_seg: 91.9096, loss: 0.2078, grad_norm: 2.1527
2023-02-19 10:52:36,138 - mmseg - INFO - Iter [85800/160000]	lr: 2.783e-05, eta: 5:53:29, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1293, decode.acc_seg: 94.5511, aux.loss_ce: 0.0773, aux.acc_seg: 91.9281, loss: 0.2067, grad_norm: 1.7944
2023-02-19 10:52:50,138 - mmseg - INFO - Iter [85850/160000]	lr: 2.781e-05, eta: 5:53:15, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1261, decode.acc_seg: 94.5512, aux.loss_ce: 0.0741, aux.acc_seg: 92.0349, loss: 0.2003, grad_norm: 1.6110
2023-02-19 10:53:06,386 - mmseg - INFO - Iter [85900/160000]	lr: 2.779e-05, eta: 5:53:02, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1294, decode.acc_seg: 94.4302, aux.loss_ce: 0.0751, aux.acc_seg: 92.0689, loss: 0.2044, grad_norm: 1.7099
2023-02-19 10:53:20,066 - mmseg - INFO - Iter [85950/160000]	lr: 2.777e-05, eta: 5:52:47, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1259, decode.acc_seg: 94.5167, aux.loss_ce: 0.0715, aux.acc_seg: 92.2819, loss: 0.1974, grad_norm: 1.5653
2023-02-19 10:53:34,227 - mmseg - INFO - Saving checkpoint at 86000 iterations
2023-02-19 10:53:37,554 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:53:37,554 - mmseg - INFO - Iter [86000/160000]	lr: 2.775e-05, eta: 5:52:36, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.4347, aux.loss_ce: 0.0785, aux.acc_seg: 91.9631, loss: 0.2143, grad_norm: 1.9426
2023-02-19 10:53:51,619 - mmseg - INFO - Iter [86050/160000]	lr: 2.773e-05, eta: 5:52:21, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.6906, aux.loss_ce: 0.0714, aux.acc_seg: 92.3648, loss: 0.1934, grad_norm: 1.6382
2023-02-19 10:54:05,586 - mmseg - INFO - Iter [86100/160000]	lr: 2.771e-05, eta: 5:52:07, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1310, decode.acc_seg: 94.4365, aux.loss_ce: 0.0754, aux.acc_seg: 92.2407, loss: 0.2064, grad_norm: 2.0875
2023-02-19 10:54:19,560 - mmseg - INFO - Iter [86150/160000]	lr: 2.769e-05, eta: 5:51:52, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1254, decode.acc_seg: 94.5201, aux.loss_ce: 0.0718, aux.acc_seg: 92.3509, loss: 0.1972, grad_norm: 1.5100
2023-02-19 10:54:33,483 - mmseg - INFO - Iter [86200/160000]	lr: 2.768e-05, eta: 5:51:38, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1213, decode.acc_seg: 94.8328, aux.loss_ce: 0.0700, aux.acc_seg: 92.6144, loss: 0.1912, grad_norm: 1.6335
2023-02-19 10:54:47,523 - mmseg - INFO - Iter [86250/160000]	lr: 2.766e-05, eta: 5:51:23, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1255, decode.acc_seg: 94.4896, aux.loss_ce: 0.0726, aux.acc_seg: 92.2850, loss: 0.1981, grad_norm: 1.8298
2023-02-19 10:55:01,762 - mmseg - INFO - Iter [86300/160000]	lr: 2.764e-05, eta: 5:51:09, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1339, decode.acc_seg: 94.4095, aux.loss_ce: 0.0767, aux.acc_seg: 92.1573, loss: 0.2106, grad_norm: 1.5500
2023-02-19 10:55:15,897 - mmseg - INFO - Iter [86350/160000]	lr: 2.762e-05, eta: 5:50:54, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1341, decode.acc_seg: 94.3428, aux.loss_ce: 0.0760, aux.acc_seg: 91.9832, loss: 0.2100, grad_norm: 1.7301
2023-02-19 10:55:30,633 - mmseg - INFO - Iter [86400/160000]	lr: 2.760e-05, eta: 5:50:40, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1366, decode.acc_seg: 94.3078, aux.loss_ce: 0.0783, aux.acc_seg: 92.0408, loss: 0.2149, grad_norm: 2.2570
2023-02-19 10:55:44,905 - mmseg - INFO - Iter [86450/160000]	lr: 2.758e-05, eta: 5:50:26, time: 0.286, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1252, decode.acc_seg: 94.6196, aux.loss_ce: 0.0709, aux.acc_seg: 92.4273, loss: 0.1961, grad_norm: 1.9750
2023-02-19 10:55:58,719 - mmseg - INFO - Iter [86500/160000]	lr: 2.756e-05, eta: 5:50:11, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1264, decode.acc_seg: 94.6876, aux.loss_ce: 0.0746, aux.acc_seg: 92.2711, loss: 0.2010, grad_norm: 1.5552
2023-02-19 10:56:13,604 - mmseg - INFO - Iter [86550/160000]	lr: 2.754e-05, eta: 5:49:58, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1378, decode.acc_seg: 94.1817, aux.loss_ce: 0.0791, aux.acc_seg: 91.6777, loss: 0.2170, grad_norm: 2.2963
2023-02-19 10:56:27,757 - mmseg - INFO - Iter [86600/160000]	lr: 2.753e-05, eta: 5:49:43, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1324, decode.acc_seg: 94.5397, aux.loss_ce: 0.0783, aux.acc_seg: 91.9518, loss: 0.2108, grad_norm: 1.8226
2023-02-19 10:56:42,163 - mmseg - INFO - Iter [86650/160000]	lr: 2.751e-05, eta: 5:49:29, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1208, decode.acc_seg: 94.7469, aux.loss_ce: 0.0732, aux.acc_seg: 92.2786, loss: 0.1940, grad_norm: 1.9209
2023-02-19 10:56:56,029 - mmseg - INFO - Iter [86700/160000]	lr: 2.749e-05, eta: 5:49:14, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1330, decode.acc_seg: 94.4373, aux.loss_ce: 0.0766, aux.acc_seg: 92.0827, loss: 0.2095, grad_norm: 1.7435
2023-02-19 10:57:09,853 - mmseg - INFO - Iter [86750/160000]	lr: 2.747e-05, eta: 5:49:00, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1345, decode.acc_seg: 94.4924, aux.loss_ce: 0.0774, aux.acc_seg: 92.1480, loss: 0.2119, grad_norm: 1.9094
2023-02-19 10:57:23,876 - mmseg - INFO - Iter [86800/160000]	lr: 2.745e-05, eta: 5:48:45, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.4362, aux.loss_ce: 0.0772, aux.acc_seg: 92.0204, loss: 0.2129, grad_norm: 2.2504
2023-02-19 10:57:38,110 - mmseg - INFO - Iter [86850/160000]	lr: 2.743e-05, eta: 5:48:31, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1278, decode.acc_seg: 94.5412, aux.loss_ce: 0.0750, aux.acc_seg: 92.0534, loss: 0.2028, grad_norm: 1.8699
2023-02-19 10:57:52,043 - mmseg - INFO - Iter [86900/160000]	lr: 2.741e-05, eta: 5:48:16, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1255, decode.acc_seg: 94.5288, aux.loss_ce: 0.0710, aux.acc_seg: 92.3370, loss: 0.1965, grad_norm: 2.0357
2023-02-19 10:58:06,459 - mmseg - INFO - Iter [86950/160000]	lr: 2.739e-05, eta: 5:48:02, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1344, decode.acc_seg: 94.3758, aux.loss_ce: 0.0805, aux.acc_seg: 91.9358, loss: 0.2149, grad_norm: 2.0069
2023-02-19 10:58:20,439 - mmseg - INFO - Saving checkpoint at 87000 iterations
2023-02-19 10:58:23,928 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 10:58:23,928 - mmseg - INFO - Iter [87000/160000]	lr: 2.738e-05, eta: 5:47:50, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.0129, aux.loss_ce: 0.0777, aux.acc_seg: 91.6598, loss: 0.2134, grad_norm: 2.0906
2023-02-19 10:58:38,129 - mmseg - INFO - Iter [87050/160000]	lr: 2.736e-05, eta: 5:47:36, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1265, decode.acc_seg: 94.5250, aux.loss_ce: 0.0740, aux.acc_seg: 92.0945, loss: 0.2005, grad_norm: 1.9072
2023-02-19 10:58:51,798 - mmseg - INFO - Iter [87100/160000]	lr: 2.734e-05, eta: 5:47:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1314, decode.acc_seg: 94.4235, aux.loss_ce: 0.0745, aux.acc_seg: 92.2760, loss: 0.2059, grad_norm: 1.9683
2023-02-19 10:59:07,769 - mmseg - INFO - Iter [87150/160000]	lr: 2.732e-05, eta: 5:47:08, time: 0.319, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1357, decode.acc_seg: 94.1129, aux.loss_ce: 0.0777, aux.acc_seg: 91.6929, loss: 0.2134, grad_norm: 2.1025
2023-02-19 10:59:21,283 - mmseg - INFO - Iter [87200/160000]	lr: 2.730e-05, eta: 5:46:53, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1297, decode.acc_seg: 94.5641, aux.loss_ce: 0.0742, aux.acc_seg: 92.3438, loss: 0.2039, grad_norm: 1.4480
2023-02-19 10:59:35,663 - mmseg - INFO - Iter [87250/160000]	lr: 2.728e-05, eta: 5:46:39, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1267, decode.acc_seg: 94.6103, aux.loss_ce: 0.0725, aux.acc_seg: 92.4030, loss: 0.1992, grad_norm: 1.7267
2023-02-19 10:59:49,346 - mmseg - INFO - Iter [87300/160000]	lr: 2.726e-05, eta: 5:46:24, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1306, decode.acc_seg: 94.3655, aux.loss_ce: 0.0754, aux.acc_seg: 92.0673, loss: 0.2060, grad_norm: 1.8265
2023-02-19 11:00:03,488 - mmseg - INFO - Iter [87350/160000]	lr: 2.724e-05, eta: 5:46:10, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.7506, aux.loss_ce: 0.0709, aux.acc_seg: 92.5099, loss: 0.1929, grad_norm: 1.6834
2023-02-19 11:00:17,397 - mmseg - INFO - Iter [87400/160000]	lr: 2.723e-05, eta: 5:45:55, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1322, decode.acc_seg: 94.3737, aux.loss_ce: 0.0747, aux.acc_seg: 92.1882, loss: 0.2069, grad_norm: 1.5938
2023-02-19 11:00:32,241 - mmseg - INFO - Iter [87450/160000]	lr: 2.721e-05, eta: 5:45:41, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1346, decode.acc_seg: 94.2665, aux.loss_ce: 0.0767, aux.acc_seg: 92.0349, loss: 0.2113, grad_norm: 1.9350
2023-02-19 11:00:46,370 - mmseg - INFO - Iter [87500/160000]	lr: 2.719e-05, eta: 5:45:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1279, decode.acc_seg: 94.5485, aux.loss_ce: 0.0756, aux.acc_seg: 92.0223, loss: 0.2035, grad_norm: 1.5851
2023-02-19 11:01:00,453 - mmseg - INFO - Iter [87550/160000]	lr: 2.717e-05, eta: 5:45:13, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1469, decode.acc_seg: 94.2499, aux.loss_ce: 0.0809, aux.acc_seg: 92.0541, loss: 0.2278, grad_norm: 2.4133
2023-02-19 11:01:14,093 - mmseg - INFO - Iter [87600/160000]	lr: 2.715e-05, eta: 5:44:58, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1307, decode.acc_seg: 94.3839, aux.loss_ce: 0.0748, aux.acc_seg: 92.0402, loss: 0.2055, grad_norm: 1.7606
2023-02-19 11:01:27,998 - mmseg - INFO - Iter [87650/160000]	lr: 2.713e-05, eta: 5:44:43, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1213, decode.acc_seg: 94.8914, aux.loss_ce: 0.0728, aux.acc_seg: 92.4533, loss: 0.1941, grad_norm: 2.4566
2023-02-19 11:01:41,823 - mmseg - INFO - Iter [87700/160000]	lr: 2.711e-05, eta: 5:44:28, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1243, decode.acc_seg: 94.5865, aux.loss_ce: 0.0721, aux.acc_seg: 92.2863, loss: 0.1964, grad_norm: 1.6814
2023-02-19 11:01:55,599 - mmseg - INFO - Iter [87750/160000]	lr: 2.709e-05, eta: 5:44:14, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1354, decode.acc_seg: 94.2134, aux.loss_ce: 0.0759, aux.acc_seg: 91.9176, loss: 0.2113, grad_norm: 1.8460
2023-02-19 11:02:09,478 - mmseg - INFO - Iter [87800/160000]	lr: 2.708e-05, eta: 5:43:59, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1367, decode.acc_seg: 94.2122, aux.loss_ce: 0.0804, aux.acc_seg: 91.6702, loss: 0.2171, grad_norm: 2.4806
2023-02-19 11:02:23,594 - mmseg - INFO - Iter [87850/160000]	lr: 2.706e-05, eta: 5:43:45, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1311, decode.acc_seg: 94.5410, aux.loss_ce: 0.0766, aux.acc_seg: 92.0607, loss: 0.2077, grad_norm: 2.0737
2023-02-19 11:02:37,539 - mmseg - INFO - Iter [87900/160000]	lr: 2.704e-05, eta: 5:43:30, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1252, decode.acc_seg: 94.6293, aux.loss_ce: 0.0715, aux.acc_seg: 92.4633, loss: 0.1967, grad_norm: 1.7957
2023-02-19 11:02:52,024 - mmseg - INFO - Iter [87950/160000]	lr: 2.702e-05, eta: 5:43:16, time: 0.290, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1321, decode.acc_seg: 94.3531, aux.loss_ce: 0.0771, aux.acc_seg: 91.9219, loss: 0.2091, grad_norm: 2.4651
2023-02-19 11:03:06,357 - mmseg - INFO - Saving checkpoint at 88000 iterations
2023-02-19 11:03:09,652 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:03:09,652 - mmseg - INFO - Iter [88000/160000]	lr: 2.700e-05, eta: 5:43:04, time: 0.353, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1296, decode.acc_seg: 94.6316, aux.loss_ce: 0.0760, aux.acc_seg: 92.2723, loss: 0.2056, grad_norm: 1.9953
2023-02-19 11:03:23,570 - mmseg - INFO - Iter [88050/160000]	lr: 2.698e-05, eta: 5:42:50, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1190, decode.acc_seg: 94.7506, aux.loss_ce: 0.0697, aux.acc_seg: 92.5671, loss: 0.1887, grad_norm: 1.7282
2023-02-19 11:03:37,283 - mmseg - INFO - Iter [88100/160000]	lr: 2.696e-05, eta: 5:42:35, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1301, decode.acc_seg: 94.4333, aux.loss_ce: 0.0742, aux.acc_seg: 92.2323, loss: 0.2043, grad_norm: 1.7559
2023-02-19 11:03:51,483 - mmseg - INFO - Iter [88150/160000]	lr: 2.694e-05, eta: 5:42:21, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1288, decode.acc_seg: 94.4883, aux.loss_ce: 0.0747, aux.acc_seg: 92.0290, loss: 0.2035, grad_norm: 1.8806
2023-02-19 11:04:05,449 - mmseg - INFO - Iter [88200/160000]	lr: 2.693e-05, eta: 5:42:06, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1361, decode.acc_seg: 94.1605, aux.loss_ce: 0.0772, aux.acc_seg: 91.9078, loss: 0.2133, grad_norm: 1.8559
2023-02-19 11:04:19,998 - mmseg - INFO - Iter [88250/160000]	lr: 2.691e-05, eta: 5:41:52, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1208, decode.acc_seg: 94.7987, aux.loss_ce: 0.0701, aux.acc_seg: 92.4852, loss: 0.1908, grad_norm: 1.5538
2023-02-19 11:04:34,144 - mmseg - INFO - Iter [88300/160000]	lr: 2.689e-05, eta: 5:41:38, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1314, decode.acc_seg: 94.4523, aux.loss_ce: 0.0809, aux.acc_seg: 91.6546, loss: 0.2123, grad_norm: 2.5094
2023-02-19 11:04:47,723 - mmseg - INFO - Iter [88350/160000]	lr: 2.687e-05, eta: 5:41:23, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1223, decode.acc_seg: 94.8108, aux.loss_ce: 0.0741, aux.acc_seg: 92.0891, loss: 0.1963, grad_norm: 1.7548
2023-02-19 11:05:01,614 - mmseg - INFO - Iter [88400/160000]	lr: 2.685e-05, eta: 5:41:08, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1277, decode.acc_seg: 94.5010, aux.loss_ce: 0.0743, aux.acc_seg: 92.1424, loss: 0.2020, grad_norm: 1.6598
2023-02-19 11:05:17,817 - mmseg - INFO - Iter [88450/160000]	lr: 2.683e-05, eta: 5:40:55, time: 0.324, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1222, decode.acc_seg: 94.7964, aux.loss_ce: 0.0732, aux.acc_seg: 92.3829, loss: 0.1953, grad_norm: 1.5743
2023-02-19 11:05:31,986 - mmseg - INFO - Iter [88500/160000]	lr: 2.681e-05, eta: 5:40:41, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1215, decode.acc_seg: 94.7648, aux.loss_ce: 0.0718, aux.acc_seg: 92.4080, loss: 0.1933, grad_norm: 1.6244
2023-02-19 11:05:45,666 - mmseg - INFO - Iter [88550/160000]	lr: 2.679e-05, eta: 5:40:26, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1278, decode.acc_seg: 94.4784, aux.loss_ce: 0.0753, aux.acc_seg: 91.9395, loss: 0.2031, grad_norm: 1.9632
2023-02-19 11:05:59,451 - mmseg - INFO - Iter [88600/160000]	lr: 2.678e-05, eta: 5:40:11, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.5891, aux.loss_ce: 0.0747, aux.acc_seg: 92.2985, loss: 0.2031, grad_norm: 1.7669
2023-02-19 11:06:13,307 - mmseg - INFO - Iter [88650/160000]	lr: 2.676e-05, eta: 5:39:57, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1259, decode.acc_seg: 94.7462, aux.loss_ce: 0.0753, aux.acc_seg: 92.2015, loss: 0.2012, grad_norm: 1.7801
2023-02-19 11:06:27,312 - mmseg - INFO - Iter [88700/160000]	lr: 2.674e-05, eta: 5:39:42, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1286, decode.acc_seg: 94.6467, aux.loss_ce: 0.0738, aux.acc_seg: 92.3537, loss: 0.2024, grad_norm: 1.6026
2023-02-19 11:06:41,165 - mmseg - INFO - Iter [88750/160000]	lr: 2.672e-05, eta: 5:39:28, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.3230, aux.loss_ce: 0.0743, aux.acc_seg: 92.0735, loss: 0.2027, grad_norm: 1.7417
2023-02-19 11:06:54,979 - mmseg - INFO - Iter [88800/160000]	lr: 2.670e-05, eta: 5:39:13, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1314, decode.acc_seg: 94.3727, aux.loss_ce: 0.0781, aux.acc_seg: 91.8160, loss: 0.2094, grad_norm: 1.9551
2023-02-19 11:07:08,775 - mmseg - INFO - Iter [88850/160000]	lr: 2.668e-05, eta: 5:38:58, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1227, decode.acc_seg: 94.7493, aux.loss_ce: 0.0738, aux.acc_seg: 92.2649, loss: 0.1966, grad_norm: 1.6521
2023-02-19 11:07:22,449 - mmseg - INFO - Iter [88900/160000]	lr: 2.666e-05, eta: 5:38:43, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1241, decode.acc_seg: 94.6834, aux.loss_ce: 0.0758, aux.acc_seg: 92.0052, loss: 0.1998, grad_norm: 1.6989
2023-02-19 11:07:36,149 - mmseg - INFO - Iter [88950/160000]	lr: 2.664e-05, eta: 5:38:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1190, decode.acc_seg: 94.9525, aux.loss_ce: 0.0735, aux.acc_seg: 92.3412, loss: 0.1925, grad_norm: 1.4522
2023-02-19 11:07:49,817 - mmseg - INFO - Saving checkpoint at 89000 iterations
2023-02-19 11:07:53,118 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:07:53,118 - mmseg - INFO - Iter [89000/160000]	lr: 2.663e-05, eta: 5:38:17, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1240, decode.acc_seg: 94.6769, aux.loss_ce: 0.0734, aux.acc_seg: 92.3262, loss: 0.1974, grad_norm: 1.9750
2023-02-19 11:08:06,728 - mmseg - INFO - Iter [89050/160000]	lr: 2.661e-05, eta: 5:38:02, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1277, decode.acc_seg: 94.6523, aux.loss_ce: 0.0778, aux.acc_seg: 92.0496, loss: 0.2054, grad_norm: 2.0804
2023-02-19 11:08:20,493 - mmseg - INFO - Iter [89100/160000]	lr: 2.659e-05, eta: 5:37:47, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1292, decode.acc_seg: 94.4974, aux.loss_ce: 0.0752, aux.acc_seg: 92.1629, loss: 0.2045, grad_norm: 1.6337
2023-02-19 11:08:34,418 - mmseg - INFO - Iter [89150/160000]	lr: 2.657e-05, eta: 5:37:32, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1199, decode.acc_seg: 94.8448, aux.loss_ce: 0.0700, aux.acc_seg: 92.5734, loss: 0.1899, grad_norm: 1.8695
2023-02-19 11:08:48,166 - mmseg - INFO - Iter [89200/160000]	lr: 2.655e-05, eta: 5:37:18, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1230, decode.acc_seg: 94.7127, aux.loss_ce: 0.0704, aux.acc_seg: 92.5241, loss: 0.1934, grad_norm: 1.4834
2023-02-19 11:09:03,246 - mmseg - INFO - Iter [89250/160000]	lr: 2.653e-05, eta: 5:37:04, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1233, decode.acc_seg: 94.7805, aux.loss_ce: 0.0719, aux.acc_seg: 92.5673, loss: 0.1952, grad_norm: 1.4798
2023-02-19 11:09:17,207 - mmseg - INFO - Iter [89300/160000]	lr: 2.651e-05, eta: 5:36:49, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1309, decode.acc_seg: 94.3983, aux.loss_ce: 0.0758, aux.acc_seg: 91.8651, loss: 0.2067, grad_norm: 1.6239
2023-02-19 11:09:30,991 - mmseg - INFO - Iter [89350/160000]	lr: 2.649e-05, eta: 5:36:35, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1245, decode.acc_seg: 94.6231, aux.loss_ce: 0.0708, aux.acc_seg: 92.4348, loss: 0.1953, grad_norm: 1.7686
2023-02-19 11:09:45,047 - mmseg - INFO - Iter [89400/160000]	lr: 2.648e-05, eta: 5:36:20, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1272, decode.acc_seg: 94.5119, aux.loss_ce: 0.0748, aux.acc_seg: 92.0421, loss: 0.2021, grad_norm: 1.9452
2023-02-19 11:09:59,470 - mmseg - INFO - Iter [89450/160000]	lr: 2.646e-05, eta: 5:36:06, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1318, decode.acc_seg: 94.4051, aux.loss_ce: 0.0757, aux.acc_seg: 92.0494, loss: 0.2075, grad_norm: 1.6515
2023-02-19 11:10:13,267 - mmseg - INFO - Iter [89500/160000]	lr: 2.644e-05, eta: 5:35:51, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1328, decode.acc_seg: 94.4637, aux.loss_ce: 0.0768, aux.acc_seg: 92.0252, loss: 0.2096, grad_norm: 2.0439
2023-02-19 11:10:27,094 - mmseg - INFO - Iter [89550/160000]	lr: 2.642e-05, eta: 5:35:37, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1266, decode.acc_seg: 94.6449, aux.loss_ce: 0.0736, aux.acc_seg: 92.2787, loss: 0.2002, grad_norm: 1.7008
2023-02-19 11:10:41,208 - mmseg - INFO - Iter [89600/160000]	lr: 2.640e-05, eta: 5:35:22, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1195, decode.acc_seg: 94.8280, aux.loss_ce: 0.0710, aux.acc_seg: 92.5169, loss: 0.1905, grad_norm: 1.7634
2023-02-19 11:10:54,854 - mmseg - INFO - Iter [89650/160000]	lr: 2.638e-05, eta: 5:35:08, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1335, decode.acc_seg: 94.4604, aux.loss_ce: 0.0808, aux.acc_seg: 92.1210, loss: 0.2143, grad_norm: 2.2115
2023-02-19 11:11:10,748 - mmseg - INFO - Iter [89700/160000]	lr: 2.636e-05, eta: 5:34:54, time: 0.318, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1274, decode.acc_seg: 94.4433, aux.loss_ce: 0.0756, aux.acc_seg: 91.9722, loss: 0.2030, grad_norm: 1.9849
2023-02-19 11:11:24,435 - mmseg - INFO - Iter [89750/160000]	lr: 2.634e-05, eta: 5:34:40, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1292, decode.acc_seg: 94.4339, aux.loss_ce: 0.0735, aux.acc_seg: 92.1646, loss: 0.2027, grad_norm: 1.6703
2023-02-19 11:11:38,053 - mmseg - INFO - Iter [89800/160000]	lr: 2.633e-05, eta: 5:34:25, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1293, decode.acc_seg: 94.4791, aux.loss_ce: 0.0760, aux.acc_seg: 92.0834, loss: 0.2053, grad_norm: 2.0567
2023-02-19 11:11:51,722 - mmseg - INFO - Iter [89850/160000]	lr: 2.631e-05, eta: 5:34:10, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1172, decode.acc_seg: 94.8639, aux.loss_ce: 0.0682, aux.acc_seg: 92.7060, loss: 0.1854, grad_norm: 1.3065
2023-02-19 11:12:05,550 - mmseg - INFO - Iter [89900/160000]	lr: 2.629e-05, eta: 5:33:55, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1253, decode.acc_seg: 94.7305, aux.loss_ce: 0.0730, aux.acc_seg: 92.3037, loss: 0.1983, grad_norm: 1.5098
2023-02-19 11:12:19,307 - mmseg - INFO - Iter [89950/160000]	lr: 2.627e-05, eta: 5:33:41, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1187, decode.acc_seg: 94.8309, aux.loss_ce: 0.0720, aux.acc_seg: 92.3795, loss: 0.1907, grad_norm: 1.9987
2023-02-19 11:12:33,106 - mmseg - INFO - Saving checkpoint at 90000 iterations
2023-02-19 11:12:36,397 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:12:36,397 - mmseg - INFO - Iter [90000/160000]	lr: 2.625e-05, eta: 5:33:29, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1287, decode.acc_seg: 94.5075, aux.loss_ce: 0.0740, aux.acc_seg: 92.2311, loss: 0.2027, grad_norm: 2.0807
2023-02-19 11:12:50,724 - mmseg - INFO - Iter [90050/160000]	lr: 2.623e-05, eta: 5:33:14, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1252, decode.acc_seg: 94.5522, aux.loss_ce: 0.0738, aux.acc_seg: 92.1499, loss: 0.1990, grad_norm: 2.2766
2023-02-19 11:13:04,910 - mmseg - INFO - Iter [90100/160000]	lr: 2.621e-05, eta: 5:33:00, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1294, decode.acc_seg: 94.5071, aux.loss_ce: 0.0748, aux.acc_seg: 92.0966, loss: 0.2042, grad_norm: 2.1758
2023-02-19 11:13:19,652 - mmseg - INFO - Iter [90150/160000]	lr: 2.619e-05, eta: 5:32:46, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.7649, aux.loss_ce: 0.0698, aux.acc_seg: 92.6394, loss: 0.1917, grad_norm: 1.7089
2023-02-19 11:13:33,522 - mmseg - INFO - Iter [90200/160000]	lr: 2.618e-05, eta: 5:32:31, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.3109, aux.loss_ce: 0.0732, aux.acc_seg: 92.0081, loss: 0.2017, grad_norm: 2.4484
2023-02-19 11:13:47,182 - mmseg - INFO - Iter [90250/160000]	lr: 2.616e-05, eta: 5:32:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1275, decode.acc_seg: 94.6616, aux.loss_ce: 0.0729, aux.acc_seg: 92.4604, loss: 0.2005, grad_norm: 2.3154
2023-02-19 11:14:01,361 - mmseg - INFO - Iter [90300/160000]	lr: 2.614e-05, eta: 5:32:02, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1199, decode.acc_seg: 94.9107, aux.loss_ce: 0.0717, aux.acc_seg: 92.5795, loss: 0.1916, grad_norm: 1.5918
2023-02-19 11:14:15,488 - mmseg - INFO - Iter [90350/160000]	lr: 2.612e-05, eta: 5:31:48, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1258, decode.acc_seg: 94.6937, aux.loss_ce: 0.0724, aux.acc_seg: 92.2738, loss: 0.1982, grad_norm: 1.5225
2023-02-19 11:14:29,679 - mmseg - INFO - Iter [90400/160000]	lr: 2.610e-05, eta: 5:31:33, time: 0.284, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1314, decode.acc_seg: 94.5551, aux.loss_ce: 0.0769, aux.acc_seg: 92.1836, loss: 0.2083, grad_norm: 1.7541
2023-02-19 11:14:44,579 - mmseg - INFO - Iter [90450/160000]	lr: 2.608e-05, eta: 5:31:20, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1256, decode.acc_seg: 94.6678, aux.loss_ce: 0.0734, aux.acc_seg: 92.3247, loss: 0.1990, grad_norm: 1.9281
2023-02-19 11:14:58,658 - mmseg - INFO - Iter [90500/160000]	lr: 2.606e-05, eta: 5:31:05, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1273, decode.acc_seg: 94.5866, aux.loss_ce: 0.0737, aux.acc_seg: 92.2961, loss: 0.2010, grad_norm: 1.5343
2023-02-19 11:15:12,564 - mmseg - INFO - Iter [90550/160000]	lr: 2.604e-05, eta: 5:30:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1211, decode.acc_seg: 94.7350, aux.loss_ce: 0.0705, aux.acc_seg: 92.5568, loss: 0.1916, grad_norm: 1.8569
2023-02-19 11:15:26,272 - mmseg - INFO - Iter [90600/160000]	lr: 2.603e-05, eta: 5:30:36, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1280, decode.acc_seg: 94.3896, aux.loss_ce: 0.0744, aux.acc_seg: 92.0384, loss: 0.2024, grad_norm: 1.7877
2023-02-19 11:15:40,742 - mmseg - INFO - Iter [90650/160000]	lr: 2.601e-05, eta: 5:30:22, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1258, decode.acc_seg: 94.5754, aux.loss_ce: 0.0713, aux.acc_seg: 92.4496, loss: 0.1971, grad_norm: 2.0146
2023-02-19 11:15:54,718 - mmseg - INFO - Iter [90700/160000]	lr: 2.599e-05, eta: 5:30:07, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1234, decode.acc_seg: 94.7639, aux.loss_ce: 0.0746, aux.acc_seg: 92.2797, loss: 0.1980, grad_norm: 1.8474
2023-02-19 11:16:08,640 - mmseg - INFO - Iter [90750/160000]	lr: 2.597e-05, eta: 5:29:53, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1204, decode.acc_seg: 94.7654, aux.loss_ce: 0.0713, aux.acc_seg: 92.4192, loss: 0.1917, grad_norm: 1.9027
2023-02-19 11:16:22,631 - mmseg - INFO - Iter [90800/160000]	lr: 2.595e-05, eta: 5:29:38, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1303, decode.acc_seg: 94.5313, aux.loss_ce: 0.0755, aux.acc_seg: 92.3174, loss: 0.2059, grad_norm: 2.1272
2023-02-19 11:16:36,295 - mmseg - INFO - Iter [90850/160000]	lr: 2.593e-05, eta: 5:29:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.7460, aux.loss_ce: 0.0702, aux.acc_seg: 92.5483, loss: 0.1922, grad_norm: 1.6392
2023-02-19 11:16:50,332 - mmseg - INFO - Iter [90900/160000]	lr: 2.591e-05, eta: 5:29:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1212, decode.acc_seg: 94.7600, aux.loss_ce: 0.0702, aux.acc_seg: 92.5361, loss: 0.1914, grad_norm: 1.8081
2023-02-19 11:17:06,152 - mmseg - INFO - Iter [90950/160000]	lr: 2.589e-05, eta: 5:28:56, time: 0.316, data_time: 0.046, memory: 15214, decode.loss_ce: 0.1307, decode.acc_seg: 94.5524, aux.loss_ce: 0.0747, aux.acc_seg: 92.2752, loss: 0.2054, grad_norm: 2.2791
2023-02-19 11:17:19,665 - mmseg - INFO - Saving checkpoint at 91000 iterations
2023-02-19 11:17:22,931 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:17:22,932 - mmseg - INFO - Iter [91000/160000]	lr: 2.588e-05, eta: 5:28:43, time: 0.336, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1260, decode.acc_seg: 94.6058, aux.loss_ce: 0.0732, aux.acc_seg: 92.2364, loss: 0.1991, grad_norm: 1.7305
2023-02-19 11:17:36,931 - mmseg - INFO - Iter [91050/160000]	lr: 2.586e-05, eta: 5:28:29, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1314, decode.acc_seg: 94.3851, aux.loss_ce: 0.0759, aux.acc_seg: 92.0346, loss: 0.2073, grad_norm: 1.6389
2023-02-19 11:17:50,595 - mmseg - INFO - Iter [91100/160000]	lr: 2.584e-05, eta: 5:28:14, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1229, decode.acc_seg: 94.6383, aux.loss_ce: 0.0702, aux.acc_seg: 92.3912, loss: 0.1931, grad_norm: 1.9068
2023-02-19 11:18:04,745 - mmseg - INFO - Iter [91150/160000]	lr: 2.582e-05, eta: 5:28:00, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1246, decode.acc_seg: 94.6417, aux.loss_ce: 0.0723, aux.acc_seg: 92.4019, loss: 0.1969, grad_norm: 1.6404
2023-02-19 11:18:18,872 - mmseg - INFO - Iter [91200/160000]	lr: 2.580e-05, eta: 5:27:45, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1207, decode.acc_seg: 94.8187, aux.loss_ce: 0.0712, aux.acc_seg: 92.4703, loss: 0.1920, grad_norm: 1.9445
2023-02-19 11:18:32,609 - mmseg - INFO - Iter [91250/160000]	lr: 2.578e-05, eta: 5:27:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1224, decode.acc_seg: 94.7218, aux.loss_ce: 0.0715, aux.acc_seg: 92.4459, loss: 0.1939, grad_norm: 1.5597
2023-02-19 11:18:46,654 - mmseg - INFO - Iter [91300/160000]	lr: 2.576e-05, eta: 5:27:16, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1227, decode.acc_seg: 94.6795, aux.loss_ce: 0.0720, aux.acc_seg: 92.3867, loss: 0.1947, grad_norm: 1.4322
2023-02-19 11:19:00,366 - mmseg - INFO - Iter [91350/160000]	lr: 2.574e-05, eta: 5:27:01, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1272, decode.acc_seg: 94.4755, aux.loss_ce: 0.0730, aux.acc_seg: 92.1949, loss: 0.2002, grad_norm: 1.5557
2023-02-19 11:19:14,354 - mmseg - INFO - Iter [91400/160000]	lr: 2.573e-05, eta: 5:26:47, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1271, decode.acc_seg: 94.4497, aux.loss_ce: 0.0728, aux.acc_seg: 92.1586, loss: 0.1999, grad_norm: 2.5262
2023-02-19 11:19:27,967 - mmseg - INFO - Iter [91450/160000]	lr: 2.571e-05, eta: 5:26:32, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1272, decode.acc_seg: 94.5911, aux.loss_ce: 0.0755, aux.acc_seg: 92.0900, loss: 0.2028, grad_norm: 2.0812
2023-02-19 11:19:41,970 - mmseg - INFO - Iter [91500/160000]	lr: 2.569e-05, eta: 5:26:17, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1159, decode.acc_seg: 95.0178, aux.loss_ce: 0.0672, aux.acc_seg: 92.9614, loss: 0.1831, grad_norm: 1.4857
2023-02-19 11:19:56,000 - mmseg - INFO - Iter [91550/160000]	lr: 2.567e-05, eta: 5:26:03, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1228, decode.acc_seg: 94.8004, aux.loss_ce: 0.0724, aux.acc_seg: 92.4396, loss: 0.1952, grad_norm: 1.7396
2023-02-19 11:20:09,969 - mmseg - INFO - Iter [91600/160000]	lr: 2.565e-05, eta: 5:25:48, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1250, decode.acc_seg: 94.5958, aux.loss_ce: 0.0740, aux.acc_seg: 92.2588, loss: 0.1990, grad_norm: 1.8630
2023-02-19 11:20:23,560 - mmseg - INFO - Iter [91650/160000]	lr: 2.563e-05, eta: 5:25:34, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1218, decode.acc_seg: 94.7762, aux.loss_ce: 0.0720, aux.acc_seg: 92.4396, loss: 0.1938, grad_norm: 1.9996
2023-02-19 11:20:37,140 - mmseg - INFO - Iter [91700/160000]	lr: 2.561e-05, eta: 5:25:19, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1312, decode.acc_seg: 94.4209, aux.loss_ce: 0.0751, aux.acc_seg: 92.1742, loss: 0.2063, grad_norm: 2.2893
2023-02-19 11:20:50,978 - mmseg - INFO - Iter [91750/160000]	lr: 2.559e-05, eta: 5:25:04, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1216, decode.acc_seg: 94.7124, aux.loss_ce: 0.0725, aux.acc_seg: 92.1921, loss: 0.1941, grad_norm: 2.0371
2023-02-19 11:21:05,033 - mmseg - INFO - Iter [91800/160000]	lr: 2.558e-05, eta: 5:24:50, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1240, decode.acc_seg: 94.8738, aux.loss_ce: 0.0731, aux.acc_seg: 92.5416, loss: 0.1972, grad_norm: 1.9567
2023-02-19 11:21:18,750 - mmseg - INFO - Iter [91850/160000]	lr: 2.556e-05, eta: 5:24:35, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1257, decode.acc_seg: 94.4894, aux.loss_ce: 0.0707, aux.acc_seg: 92.4279, loss: 0.1964, grad_norm: 2.1101
2023-02-19 11:21:33,672 - mmseg - INFO - Iter [91900/160000]	lr: 2.554e-05, eta: 5:24:21, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1200, decode.acc_seg: 94.7252, aux.loss_ce: 0.0694, aux.acc_seg: 92.5391, loss: 0.1894, grad_norm: 1.6660
2023-02-19 11:21:47,698 - mmseg - INFO - Iter [91950/160000]	lr: 2.552e-05, eta: 5:24:07, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1214, decode.acc_seg: 94.8016, aux.loss_ce: 0.0717, aux.acc_seg: 92.3879, loss: 0.1932, grad_norm: 1.9173
2023-02-19 11:22:01,840 - mmseg - INFO - Saving checkpoint at 92000 iterations
2023-02-19 11:22:05,175 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:22:05,175 - mmseg - INFO - Iter [92000/160000]	lr: 2.550e-05, eta: 5:23:55, time: 0.350, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1252, decode.acc_seg: 94.5478, aux.loss_ce: 0.0735, aux.acc_seg: 92.0663, loss: 0.1987, grad_norm: 1.7462
2023-02-19 11:22:19,677 - mmseg - INFO - Iter [92050/160000]	lr: 2.548e-05, eta: 5:23:41, time: 0.290, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1201, decode.acc_seg: 94.7881, aux.loss_ce: 0.0688, aux.acc_seg: 92.5899, loss: 0.1890, grad_norm: 1.5240
2023-02-19 11:22:34,203 - mmseg - INFO - Iter [92100/160000]	lr: 2.546e-05, eta: 5:23:27, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1286, decode.acc_seg: 94.7103, aux.loss_ce: 0.0742, aux.acc_seg: 92.4774, loss: 0.2028, grad_norm: 1.7209
2023-02-19 11:22:48,005 - mmseg - INFO - Iter [92150/160000]	lr: 2.544e-05, eta: 5:23:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1339, decode.acc_seg: 94.4545, aux.loss_ce: 0.0766, aux.acc_seg: 92.1346, loss: 0.2106, grad_norm: 1.6653
2023-02-19 11:23:03,801 - mmseg - INFO - Iter [92200/160000]	lr: 2.543e-05, eta: 5:22:59, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1280, decode.acc_seg: 94.5310, aux.loss_ce: 0.0757, aux.acc_seg: 92.2057, loss: 0.2037, grad_norm: 1.5943
2023-02-19 11:23:17,531 - mmseg - INFO - Iter [92250/160000]	lr: 2.541e-05, eta: 5:22:44, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1230, decode.acc_seg: 94.8148, aux.loss_ce: 0.0743, aux.acc_seg: 92.2155, loss: 0.1973, grad_norm: 1.5889
2023-02-19 11:23:31,281 - mmseg - INFO - Iter [92300/160000]	lr: 2.539e-05, eta: 5:22:29, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1324, decode.acc_seg: 94.5133, aux.loss_ce: 0.0764, aux.acc_seg: 92.1989, loss: 0.2088, grad_norm: 1.8721
2023-02-19 11:23:45,624 - mmseg - INFO - Iter [92350/160000]	lr: 2.537e-05, eta: 5:22:15, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1198, decode.acc_seg: 94.9170, aux.loss_ce: 0.0721, aux.acc_seg: 92.4461, loss: 0.1919, grad_norm: 1.9364
2023-02-19 11:23:59,516 - mmseg - INFO - Iter [92400/160000]	lr: 2.535e-05, eta: 5:22:00, time: 0.279, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1205, decode.acc_seg: 94.6775, aux.loss_ce: 0.0711, aux.acc_seg: 92.3087, loss: 0.1916, grad_norm: 1.9822
2023-02-19 11:24:13,041 - mmseg - INFO - Iter [92450/160000]	lr: 2.533e-05, eta: 5:21:46, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1282, decode.acc_seg: 94.5699, aux.loss_ce: 0.0753, aux.acc_seg: 92.0189, loss: 0.2036, grad_norm: 1.6903
2023-02-19 11:24:27,231 - mmseg - INFO - Iter [92500/160000]	lr: 2.531e-05, eta: 5:21:31, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1269, decode.acc_seg: 94.5438, aux.loss_ce: 0.0748, aux.acc_seg: 92.0700, loss: 0.2017, grad_norm: 1.7907
2023-02-19 11:24:41,021 - mmseg - INFO - Iter [92550/160000]	lr: 2.529e-05, eta: 5:21:17, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1212, decode.acc_seg: 94.7400, aux.loss_ce: 0.0697, aux.acc_seg: 92.5759, loss: 0.1908, grad_norm: 1.5124
2023-02-19 11:24:54,997 - mmseg - INFO - Iter [92600/160000]	lr: 2.528e-05, eta: 5:21:02, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1312, decode.acc_seg: 94.4049, aux.loss_ce: 0.0767, aux.acc_seg: 91.9421, loss: 0.2080, grad_norm: 2.0450
2023-02-19 11:25:09,019 - mmseg - INFO - Iter [92650/160000]	lr: 2.526e-05, eta: 5:20:48, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1258, decode.acc_seg: 94.6608, aux.loss_ce: 0.0731, aux.acc_seg: 92.3655, loss: 0.1989, grad_norm: 1.5688
2023-02-19 11:25:23,053 - mmseg - INFO - Iter [92700/160000]	lr: 2.524e-05, eta: 5:20:33, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1214, decode.acc_seg: 94.6937, aux.loss_ce: 0.0704, aux.acc_seg: 92.6263, loss: 0.1917, grad_norm: 1.6841
2023-02-19 11:25:36,922 - mmseg - INFO - Iter [92750/160000]	lr: 2.522e-05, eta: 5:20:19, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1194, decode.acc_seg: 94.8907, aux.loss_ce: 0.0721, aux.acc_seg: 92.4035, loss: 0.1915, grad_norm: 2.0135
2023-02-19 11:25:50,854 - mmseg - INFO - Iter [92800/160000]	lr: 2.520e-05, eta: 5:20:04, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1259, decode.acc_seg: 94.5955, aux.loss_ce: 0.0722, aux.acc_seg: 92.2333, loss: 0.1981, grad_norm: 1.9508
2023-02-19 11:26:05,106 - mmseg - INFO - Iter [92850/160000]	lr: 2.518e-05, eta: 5:19:50, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1216, decode.acc_seg: 94.9083, aux.loss_ce: 0.0729, aux.acc_seg: 92.3686, loss: 0.1945, grad_norm: 1.9270
2023-02-19 11:26:19,686 - mmseg - INFO - Iter [92900/160000]	lr: 2.516e-05, eta: 5:19:36, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 95.0449, aux.loss_ce: 0.0688, aux.acc_seg: 92.8319, loss: 0.1842, grad_norm: 1.6301
2023-02-19 11:26:34,195 - mmseg - INFO - Iter [92950/160000]	lr: 2.514e-05, eta: 5:19:21, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1199, decode.acc_seg: 94.6884, aux.loss_ce: 0.0741, aux.acc_seg: 92.0585, loss: 0.1939, grad_norm: 2.1164
2023-02-19 11:26:47,860 - mmseg - INFO - Saving checkpoint at 93000 iterations
2023-02-19 11:26:51,094 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:26:51,094 - mmseg - INFO - Iter [93000/160000]	lr: 2.513e-05, eta: 5:19:09, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 95.0406, aux.loss_ce: 0.0679, aux.acc_seg: 92.8581, loss: 0.1856, grad_norm: 1.7453
2023-02-19 11:27:05,099 - mmseg - INFO - Iter [93050/160000]	lr: 2.511e-05, eta: 5:18:55, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1175, decode.acc_seg: 94.9507, aux.loss_ce: 0.0702, aux.acc_seg: 92.5458, loss: 0.1878, grad_norm: 1.9756
2023-02-19 11:27:19,060 - mmseg - INFO - Iter [93100/160000]	lr: 2.509e-05, eta: 5:18:40, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1311, decode.acc_seg: 94.4288, aux.loss_ce: 0.0765, aux.acc_seg: 92.0637, loss: 0.2076, grad_norm: 2.1208
2023-02-19 11:27:33,155 - mmseg - INFO - Iter [93150/160000]	lr: 2.507e-05, eta: 5:18:26, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1285, decode.acc_seg: 94.5973, aux.loss_ce: 0.0743, aux.acc_seg: 92.2804, loss: 0.2028, grad_norm: 2.0749
2023-02-19 11:27:46,869 - mmseg - INFO - Iter [93200/160000]	lr: 2.505e-05, eta: 5:18:11, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1187, decode.acc_seg: 94.8096, aux.loss_ce: 0.0675, aux.acc_seg: 92.7220, loss: 0.1862, grad_norm: 1.7335
2023-02-19 11:28:01,721 - mmseg - INFO - Iter [93250/160000]	lr: 2.503e-05, eta: 5:17:57, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1250, decode.acc_seg: 94.8150, aux.loss_ce: 0.0726, aux.acc_seg: 92.5047, loss: 0.1976, grad_norm: 2.1161
2023-02-19 11:28:16,374 - mmseg - INFO - Iter [93300/160000]	lr: 2.501e-05, eta: 5:17:43, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1238, decode.acc_seg: 94.7976, aux.loss_ce: 0.0716, aux.acc_seg: 92.4987, loss: 0.1954, grad_norm: 1.9067
2023-02-19 11:28:30,689 - mmseg - INFO - Iter [93350/160000]	lr: 2.499e-05, eta: 5:17:29, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1244, decode.acc_seg: 94.6609, aux.loss_ce: 0.0740, aux.acc_seg: 92.1455, loss: 0.1984, grad_norm: 1.8264
2023-02-19 11:28:44,684 - mmseg - INFO - Iter [93400/160000]	lr: 2.498e-05, eta: 5:17:14, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.8002, aux.loss_ce: 0.0717, aux.acc_seg: 92.3595, loss: 0.1937, grad_norm: 1.8094
2023-02-19 11:28:59,117 - mmseg - INFO - Iter [93450/160000]	lr: 2.496e-05, eta: 5:17:00, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1243, decode.acc_seg: 94.5295, aux.loss_ce: 0.0724, aux.acc_seg: 92.1992, loss: 0.1967, grad_norm: 1.7301
2023-02-19 11:29:15,065 - mmseg - INFO - Iter [93500/160000]	lr: 2.494e-05, eta: 5:16:47, time: 0.319, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1249, decode.acc_seg: 94.7133, aux.loss_ce: 0.0725, aux.acc_seg: 92.5048, loss: 0.1974, grad_norm: 1.8143
2023-02-19 11:29:28,675 - mmseg - INFO - Iter [93550/160000]	lr: 2.492e-05, eta: 5:16:32, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1182, decode.acc_seg: 94.8296, aux.loss_ce: 0.0683, aux.acc_seg: 92.6130, loss: 0.1865, grad_norm: 1.4807
2023-02-19 11:29:43,178 - mmseg - INFO - Iter [93600/160000]	lr: 2.490e-05, eta: 5:16:18, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1181, decode.acc_seg: 94.9643, aux.loss_ce: 0.0735, aux.acc_seg: 92.2907, loss: 0.1915, grad_norm: 1.5584
2023-02-19 11:29:56,881 - mmseg - INFO - Iter [93650/160000]	lr: 2.488e-05, eta: 5:16:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1166, decode.acc_seg: 94.9978, aux.loss_ce: 0.0695, aux.acc_seg: 92.6950, loss: 0.1861, grad_norm: 1.6118
2023-02-19 11:30:10,672 - mmseg - INFO - Iter [93700/160000]	lr: 2.486e-05, eta: 5:15:49, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1169, decode.acc_seg: 94.9140, aux.loss_ce: 0.0682, aux.acc_seg: 92.6828, loss: 0.1851, grad_norm: 1.4018
2023-02-19 11:30:24,651 - mmseg - INFO - Iter [93750/160000]	lr: 2.484e-05, eta: 5:15:34, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1271, decode.acc_seg: 94.6847, aux.loss_ce: 0.0734, aux.acc_seg: 92.3312, loss: 0.2005, grad_norm: 1.7898
2023-02-19 11:30:38,487 - mmseg - INFO - Iter [93800/160000]	lr: 2.483e-05, eta: 5:15:20, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1269, decode.acc_seg: 94.4609, aux.loss_ce: 0.0730, aux.acc_seg: 92.1780, loss: 0.2000, grad_norm: 1.7532
2023-02-19 11:30:52,620 - mmseg - INFO - Iter [93850/160000]	lr: 2.481e-05, eta: 5:15:05, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1143, decode.acc_seg: 95.0714, aux.loss_ce: 0.0689, aux.acc_seg: 92.7258, loss: 0.1832, grad_norm: 1.2724
2023-02-19 11:31:06,266 - mmseg - INFO - Iter [93900/160000]	lr: 2.479e-05, eta: 5:14:50, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1255, decode.acc_seg: 94.6786, aux.loss_ce: 0.0717, aux.acc_seg: 92.5375, loss: 0.1972, grad_norm: 1.8157
2023-02-19 11:31:20,419 - mmseg - INFO - Iter [93950/160000]	lr: 2.477e-05, eta: 5:14:36, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1257, decode.acc_seg: 94.5092, aux.loss_ce: 0.0727, aux.acc_seg: 92.2572, loss: 0.1984, grad_norm: 2.0963
2023-02-19 11:31:34,853 - mmseg - INFO - Saving checkpoint at 94000 iterations
2023-02-19 11:31:38,121 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:31:38,121 - mmseg - INFO - Iter [94000/160000]	lr: 2.475e-05, eta: 5:14:24, time: 0.354, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1234, decode.acc_seg: 94.7330, aux.loss_ce: 0.0729, aux.acc_seg: 92.2014, loss: 0.1963, grad_norm: 1.7020
2023-02-19 11:31:52,270 - mmseg - INFO - Iter [94050/160000]	lr: 2.473e-05, eta: 5:14:10, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1249, decode.acc_seg: 94.5400, aux.loss_ce: 0.0706, aux.acc_seg: 92.4009, loss: 0.1955, grad_norm: 1.6163
2023-02-19 11:32:06,057 - mmseg - INFO - Iter [94100/160000]	lr: 2.471e-05, eta: 5:13:55, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1223, decode.acc_seg: 94.6577, aux.loss_ce: 0.0716, aux.acc_seg: 92.4197, loss: 0.1939, grad_norm: 1.9502
2023-02-19 11:32:20,435 - mmseg - INFO - Iter [94150/160000]	lr: 2.469e-05, eta: 5:13:41, time: 0.288, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1197, decode.acc_seg: 94.8967, aux.loss_ce: 0.0703, aux.acc_seg: 92.6677, loss: 0.1899, grad_norm: 1.6256
2023-02-19 11:32:35,331 - mmseg - INFO - Iter [94200/160000]	lr: 2.468e-05, eta: 5:13:27, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1239, decode.acc_seg: 94.6539, aux.loss_ce: 0.0743, aux.acc_seg: 92.1148, loss: 0.1981, grad_norm: 1.9881
2023-02-19 11:32:49,023 - mmseg - INFO - Iter [94250/160000]	lr: 2.466e-05, eta: 5:13:12, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1245, decode.acc_seg: 94.6895, aux.loss_ce: 0.0699, aux.acc_seg: 92.6208, loss: 0.1944, grad_norm: 1.6924
2023-02-19 11:33:03,431 - mmseg - INFO - Iter [94300/160000]	lr: 2.464e-05, eta: 5:12:58, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1284, decode.acc_seg: 94.5301, aux.loss_ce: 0.0757, aux.acc_seg: 92.2387, loss: 0.2042, grad_norm: 1.9291
2023-02-19 11:33:17,458 - mmseg - INFO - Iter [94350/160000]	lr: 2.462e-05, eta: 5:12:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1319, decode.acc_seg: 94.5478, aux.loss_ce: 0.0749, aux.acc_seg: 92.3168, loss: 0.2067, grad_norm: 1.9263
2023-02-19 11:33:31,374 - mmseg - INFO - Iter [94400/160000]	lr: 2.460e-05, eta: 5:12:29, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1187, decode.acc_seg: 94.9315, aux.loss_ce: 0.0710, aux.acc_seg: 92.4972, loss: 0.1897, grad_norm: 2.0634
2023-02-19 11:33:45,106 - mmseg - INFO - Iter [94450/160000]	lr: 2.458e-05, eta: 5:12:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1255, decode.acc_seg: 94.5667, aux.loss_ce: 0.0718, aux.acc_seg: 92.3715, loss: 0.1973, grad_norm: 2.0795
2023-02-19 11:33:58,803 - mmseg - INFO - Iter [94500/160000]	lr: 2.456e-05, eta: 5:12:00, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1199, decode.acc_seg: 94.7813, aux.loss_ce: 0.0699, aux.acc_seg: 92.5209, loss: 0.1898, grad_norm: 1.4235
2023-02-19 11:34:12,609 - mmseg - INFO - Iter [94550/160000]	lr: 2.454e-05, eta: 5:11:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1198, decode.acc_seg: 94.7910, aux.loss_ce: 0.0700, aux.acc_seg: 92.5406, loss: 0.1899, grad_norm: 1.6368
2023-02-19 11:34:26,345 - mmseg - INFO - Iter [94600/160000]	lr: 2.453e-05, eta: 5:11:30, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1280, decode.acc_seg: 94.5366, aux.loss_ce: 0.0759, aux.acc_seg: 91.9454, loss: 0.2039, grad_norm: 1.6178
2023-02-19 11:34:40,136 - mmseg - INFO - Iter [94650/160000]	lr: 2.451e-05, eta: 5:11:16, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1173, decode.acc_seg: 94.8798, aux.loss_ce: 0.0687, aux.acc_seg: 92.7306, loss: 0.1860, grad_norm: 1.8046
2023-02-19 11:34:54,342 - mmseg - INFO - Iter [94700/160000]	lr: 2.449e-05, eta: 5:11:01, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 94.8820, aux.loss_ce: 0.0692, aux.acc_seg: 92.6238, loss: 0.1869, grad_norm: 1.5475
2023-02-19 11:35:10,721 - mmseg - INFO - Iter [94750/160000]	lr: 2.447e-05, eta: 5:10:49, time: 0.328, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1222, decode.acc_seg: 94.8012, aux.loss_ce: 0.0710, aux.acc_seg: 92.5368, loss: 0.1931, grad_norm: 1.9994
2023-02-19 11:35:24,913 - mmseg - INFO - Iter [94800/160000]	lr: 2.445e-05, eta: 5:10:34, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1240, decode.acc_seg: 94.6601, aux.loss_ce: 0.0706, aux.acc_seg: 92.5973, loss: 0.1946, grad_norm: 1.6809
2023-02-19 11:35:38,802 - mmseg - INFO - Iter [94850/160000]	lr: 2.443e-05, eta: 5:10:20, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1183, decode.acc_seg: 94.8698, aux.loss_ce: 0.0707, aux.acc_seg: 92.4969, loss: 0.1890, grad_norm: 1.4063
2023-02-19 11:35:52,744 - mmseg - INFO - Iter [94900/160000]	lr: 2.441e-05, eta: 5:10:05, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1257, decode.acc_seg: 94.7094, aux.loss_ce: 0.0725, aux.acc_seg: 92.6138, loss: 0.1982, grad_norm: 1.5079
2023-02-19 11:36:06,372 - mmseg - INFO - Iter [94950/160000]	lr: 2.439e-05, eta: 5:09:50, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1249, decode.acc_seg: 94.5067, aux.loss_ce: 0.0721, aux.acc_seg: 92.2778, loss: 0.1970, grad_norm: 1.8449
2023-02-19 11:36:20,086 - mmseg - INFO - Saving checkpoint at 95000 iterations
2023-02-19 11:36:23,346 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:36:23,346 - mmseg - INFO - Iter [95000/160000]	lr: 2.438e-05, eta: 5:09:38, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1221, decode.acc_seg: 94.6891, aux.loss_ce: 0.0700, aux.acc_seg: 92.6489, loss: 0.1921, grad_norm: 1.9854
2023-02-19 11:36:37,112 - mmseg - INFO - Iter [95050/160000]	lr: 2.436e-05, eta: 5:09:23, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1231, decode.acc_seg: 94.5943, aux.loss_ce: 0.0691, aux.acc_seg: 92.5720, loss: 0.1922, grad_norm: 1.8413
2023-02-19 11:36:51,163 - mmseg - INFO - Iter [95100/160000]	lr: 2.434e-05, eta: 5:09:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1161, decode.acc_seg: 95.0137, aux.loss_ce: 0.0685, aux.acc_seg: 92.7402, loss: 0.1846, grad_norm: 1.3407
2023-02-19 11:37:04,815 - mmseg - INFO - Iter [95150/160000]	lr: 2.432e-05, eta: 5:08:54, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1188, decode.acc_seg: 94.8831, aux.loss_ce: 0.0701, aux.acc_seg: 92.5635, loss: 0.1889, grad_norm: 1.6830
2023-02-19 11:37:18,737 - mmseg - INFO - Iter [95200/160000]	lr: 2.430e-05, eta: 5:08:39, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1162, decode.acc_seg: 95.0172, aux.loss_ce: 0.0688, aux.acc_seg: 92.8223, loss: 0.1851, grad_norm: 1.3879
2023-02-19 11:37:32,454 - mmseg - INFO - Iter [95250/160000]	lr: 2.428e-05, eta: 5:08:25, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1253, decode.acc_seg: 94.7140, aux.loss_ce: 0.0729, aux.acc_seg: 92.4914, loss: 0.1982, grad_norm: 1.7220
2023-02-19 11:37:46,050 - mmseg - INFO - Iter [95300/160000]	lr: 2.426e-05, eta: 5:08:10, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1233, decode.acc_seg: 94.7573, aux.loss_ce: 0.0721, aux.acc_seg: 92.4946, loss: 0.1954, grad_norm: 1.9058
2023-02-19 11:37:59,819 - mmseg - INFO - Iter [95350/160000]	lr: 2.424e-05, eta: 5:07:55, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1269, decode.acc_seg: 94.6480, aux.loss_ce: 0.0712, aux.acc_seg: 92.5744, loss: 0.1981, grad_norm: 1.4728
2023-02-19 11:38:14,154 - mmseg - INFO - Iter [95400/160000]	lr: 2.423e-05, eta: 5:07:41, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1246, decode.acc_seg: 94.5968, aux.loss_ce: 0.0721, aux.acc_seg: 92.1059, loss: 0.1967, grad_norm: 2.5691
2023-02-19 11:38:28,292 - mmseg - INFO - Iter [95450/160000]	lr: 2.421e-05, eta: 5:07:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1213, decode.acc_seg: 94.6845, aux.loss_ce: 0.0732, aux.acc_seg: 92.1725, loss: 0.1945, grad_norm: 1.7740
2023-02-19 11:38:42,359 - mmseg - INFO - Iter [95500/160000]	lr: 2.419e-05, eta: 5:07:12, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1134, decode.acc_seg: 95.0906, aux.loss_ce: 0.0664, aux.acc_seg: 92.9877, loss: 0.1798, grad_norm: 1.4924
2023-02-19 11:38:56,164 - mmseg - INFO - Iter [95550/160000]	lr: 2.417e-05, eta: 5:06:58, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1236, decode.acc_seg: 94.8008, aux.loss_ce: 0.0721, aux.acc_seg: 92.5741, loss: 0.1957, grad_norm: 1.7357
2023-02-19 11:39:09,919 - mmseg - INFO - Iter [95600/160000]	lr: 2.415e-05, eta: 5:06:43, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1252, decode.acc_seg: 94.5924, aux.loss_ce: 0.0724, aux.acc_seg: 92.3588, loss: 0.1976, grad_norm: 1.6917
2023-02-19 11:39:24,070 - mmseg - INFO - Iter [95650/160000]	lr: 2.413e-05, eta: 5:06:29, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1146, decode.acc_seg: 94.8812, aux.loss_ce: 0.0677, aux.acc_seg: 92.6727, loss: 0.1823, grad_norm: 1.6472
2023-02-19 11:39:37,633 - mmseg - INFO - Iter [95700/160000]	lr: 2.411e-05, eta: 5:06:14, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1176, decode.acc_seg: 94.8312, aux.loss_ce: 0.0684, aux.acc_seg: 92.5986, loss: 0.1860, grad_norm: 1.4830
2023-02-19 11:39:51,356 - mmseg - INFO - Iter [95750/160000]	lr: 2.409e-05, eta: 5:05:59, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1125, decode.acc_seg: 95.1124, aux.loss_ce: 0.0661, aux.acc_seg: 93.0240, loss: 0.1786, grad_norm: 1.6960
2023-02-19 11:40:04,986 - mmseg - INFO - Iter [95800/160000]	lr: 2.408e-05, eta: 5:05:44, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1204, decode.acc_seg: 94.8193, aux.loss_ce: 0.0711, aux.acc_seg: 92.4590, loss: 0.1915, grad_norm: 2.6158
2023-02-19 11:40:18,789 - mmseg - INFO - Iter [95850/160000]	lr: 2.406e-05, eta: 5:05:30, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1235, decode.acc_seg: 94.7254, aux.loss_ce: 0.0727, aux.acc_seg: 92.3414, loss: 0.1962, grad_norm: 1.6246
2023-02-19 11:40:32,393 - mmseg - INFO - Iter [95900/160000]	lr: 2.404e-05, eta: 5:05:15, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1234, decode.acc_seg: 94.6746, aux.loss_ce: 0.0730, aux.acc_seg: 92.2899, loss: 0.1964, grad_norm: 2.0851
2023-02-19 11:40:46,439 - mmseg - INFO - Iter [95950/160000]	lr: 2.402e-05, eta: 5:05:01, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1300, decode.acc_seg: 94.5327, aux.loss_ce: 0.0754, aux.acc_seg: 92.1676, loss: 0.2054, grad_norm: 2.6206
2023-02-19 11:41:02,332 - mmseg - INFO - Saving checkpoint at 96000 iterations
2023-02-19 11:41:05,623 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:41:05,623 - mmseg - INFO - Iter [96000/160000]	lr: 2.400e-05, eta: 5:04:50, time: 0.384, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1267, decode.acc_seg: 94.5281, aux.loss_ce: 0.0742, aux.acc_seg: 92.0753, loss: 0.2009, grad_norm: 1.8941
2023-02-19 11:41:19,912 - mmseg - INFO - per class results:
2023-02-19 11:41:19,918 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 79.04 | 89.16 |
|       building      | 82.32 | 91.68 |
|         sky         | 94.45 | 98.02 |
|        floor        | 81.34 | 91.06 |
|         tree        | 75.09 | 86.14 |
|       ceiling       | 85.17 | 93.91 |
|         road        | 84.17 | 92.78 |
|         bed         | 90.56 | 95.93 |
|      windowpane     | 63.14 | 76.62 |
|        grass        | 65.86 | 80.42 |
|       cabinet       | 62.07 | 75.82 |
|       sidewalk      | 67.31 | 78.69 |
|        person       | 81.84 | 93.64 |
|        earth        |  35.1 | 48.44 |
|         door        | 54.86 |  65.7 |
|        table        | 63.97 | 73.92 |
|       mountain      |  59.4 | 73.63 |
|        plant        | 52.48 | 65.47 |
|       curtain       |  76.4 | 88.17 |
|        chair        | 63.58 | 81.57 |
|         car         | 85.51 | 92.34 |
|        water        | 53.87 | 66.87 |
|       painting      | 77.25 | 88.78 |
|         sofa        |  75.0 | 88.92 |
|        shelf        | 44.99 | 64.31 |
|        house        | 41.01 | 57.05 |
|         sea         | 58.11 | 89.32 |
|        mirror       | 73.72 |  83.1 |
|         rug         | 62.27 | 76.73 |
|        field        | 29.05 | 48.75 |
|       armchair      | 52.29 | 65.54 |
|         seat        | 65.21 | 84.05 |
|        fence        | 42.23 | 57.11 |
|         desk        |  52.5 | 67.27 |
|         rock        |  46.0 | 64.44 |
|       wardrobe      | 46.36 | 66.23 |
|         lamp        | 67.84 | 79.41 |
|       bathtub       | 82.09 | 86.03 |
|       railing       | 35.98 |  43.8 |
|       cushion       | 61.75 | 81.38 |
|         base        | 40.47 | 49.68 |
|         box         | 30.43 | 39.85 |
|        column       | 51.62 | 60.56 |
|      signboard      | 39.87 | 57.26 |
|   chest of drawers  | 40.42 | 58.45 |
|       counter       | 28.51 | 38.63 |
|         sand        | 54.75 | 79.42 |
|         sink        | 73.87 | 83.73 |
|      skyscraper     | 50.04 | 62.52 |
|      fireplace      | 75.64 | 90.94 |
|     refrigerator    | 80.82 | 87.91 |
|      grandstand     | 42.01 | 67.29 |
|         path        | 23.66 | 38.87 |
|        stairs       | 27.16 | 32.21 |
|        runway       | 70.74 | 91.83 |
|         case        | 48.11 | 66.39 |
|      pool table     | 93.25 | 96.87 |
|        pillow       | 57.74 | 65.75 |
|     screen door     |  85.7 | 89.81 |
|       stairway      | 30.51 |  39.0 |
|        river        |  10.9 | 19.05 |
|        bridge       |  70.0 | 84.06 |
|       bookcase      |  47.8 | 66.18 |
|        blind        |  40.4 | 43.44 |
|     coffee table    | 63.46 | 81.32 |
|        toilet       | 87.24 | 90.52 |
|        flower       | 45.27 | 59.23 |
|         book        | 46.26 | 70.51 |
|         hill        | 13.06 | 22.42 |
|        bench        | 49.96 |  54.5 |
|      countertop     | 52.89 | 77.83 |
|        stove        | 80.59 | 83.77 |
|         palm        | 57.19 | 79.04 |
|    kitchen island   | 47.72 | 67.21 |
|       computer      | 77.06 | 86.82 |
|     swivel chair    | 41.43 | 56.19 |
|         boat        | 49.58 | 54.72 |
|         bar         | 31.92 | 42.65 |
|    arcade machine   | 46.11 | 48.45 |
|        hovel        | 33.38 |  37.1 |
|         bus         | 89.29 | 96.81 |
|        towel        | 68.73 | 85.07 |
|        light        |  58.2 | 71.59 |
|        truck        | 39.86 | 56.85 |
|        tower        | 34.35 | 58.03 |
|      chandelier     | 72.49 |  83.4 |
|        awning       | 36.74 | 48.02 |
|     streetlight     | 33.64 | 43.53 |
|        booth        |  48.2 | 55.22 |
| television receiver | 73.18 | 78.64 |
|       airplane      | 62.67 | 65.93 |
|      dirt track     |  8.84 | 40.62 |
|       apparel       | 45.52 | 67.04 |
|         pole        | 21.69 | 33.01 |
|         land        |  5.06 |  7.6  |
|      bannister      |  7.05 |  8.56 |
|      escalator      | 47.54 | 63.44 |
|       ottoman       | 44.44 | 58.82 |
|        bottle       | 37.32 | 60.38 |
|        buffet       | 43.17 | 50.34 |
|        poster       | 24.67 |  39.8 |
|        stage        | 13.83 | 20.89 |
|         van         |  38.2 | 50.42 |
|         ship        |  57.6 | 85.29 |
|       fountain      | 26.49 | 27.34 |
|    conveyer belt    | 76.32 | 92.57 |
|        canopy       | 45.95 |  52.3 |
|        washer       | 70.41 | 72.93 |
|      plaything      | 31.24 | 44.12 |
|    swimming pool    | 57.11 | 76.86 |
|        stool        | 43.92 | 48.86 |
|        barrel       | 28.88 |  74.6 |
|        basket       | 40.71 | 62.62 |
|      waterfall      | 49.03 | 58.61 |
|         tent        | 89.85 | 99.05 |
|         bag         | 15.84 | 18.02 |
|       minibike      |  71.7 | 87.92 |
|        cradle       |  85.0 | 94.17 |
|         oven        | 54.56 | 64.28 |
|         ball        | 53.89 | 66.77 |
|         food        | 52.38 | 61.48 |
|         step        |  8.97 | 10.86 |
|         tank        | 57.89 |  58.8 |
|      trade name     | 29.48 | 38.87 |
|      microwave      | 83.85 | 93.21 |
|         pot         |  53.9 | 61.82 |
|        animal       | 61.29 | 65.97 |
|       bicycle       | 57.74 |  76.8 |
|         lake        | 53.32 | 62.33 |
|      dishwasher     | 72.61 | 77.87 |
|        screen       | 56.98 | 68.21 |
|       blanket       | 24.63 | 29.47 |
|      sculpture      | 73.21 | 80.04 |
|         hood        | 66.71 | 70.07 |
|        sconce       | 52.74 | 66.86 |
|         vase        | 43.23 | 59.09 |
|    traffic light    |  30.6 |  57.2 |
|         tray        | 13.29 |  17.4 |
|        ashcan       | 44.98 | 56.09 |
|         fan         | 67.39 | 83.15 |
|         pier        | 30.58 | 47.29 |
|      crt screen     |  4.3  | 10.82 |
|        plate        | 56.56 | 83.31 |
|       monitor       | 23.98 | 31.99 |
|    bulletin board   | 46.36 | 59.21 |
|        shower       | 10.63 | 19.03 |
|       radiator      | 63.89 |  69.3 |
|        glass        | 14.69 |  15.7 |
|        clock        | 45.18 | 50.03 |
|         flag        | 61.52 | 81.69 |
+---------------------+-------+-------+
2023-02-19 11:41:19,918 - mmseg - INFO - Summary:
2023-02-19 11:41:19,918 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 83.95 | 52.16 | 64.32 |
+-------+-------+-------+
2023-02-19 11:41:19,919 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:41:19,919 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8395, mIoU: 0.5216, mAcc: 0.6432, IoU.wall: 0.7904, IoU.building: 0.8232, IoU.sky: 0.9445, IoU.floor: 0.8134, IoU.tree: 0.7509, IoU.ceiling: 0.8517, IoU.road: 0.8417, IoU.bed : 0.9056, IoU.windowpane: 0.6314, IoU.grass: 0.6586, IoU.cabinet: 0.6207, IoU.sidewalk: 0.6731, IoU.person: 0.8184, IoU.earth: 0.3510, IoU.door: 0.5486, IoU.table: 0.6397, IoU.mountain: 0.5940, IoU.plant: 0.5248, IoU.curtain: 0.7640, IoU.chair: 0.6358, IoU.car: 0.8551, IoU.water: 0.5387, IoU.painting: 0.7725, IoU.sofa: 0.7500, IoU.shelf: 0.4499, IoU.house: 0.4101, IoU.sea: 0.5811, IoU.mirror: 0.7372, IoU.rug: 0.6227, IoU.field: 0.2905, IoU.armchair: 0.5229, IoU.seat: 0.6521, IoU.fence: 0.4223, IoU.desk: 0.5250, IoU.rock: 0.4600, IoU.wardrobe: 0.4636, IoU.lamp: 0.6784, IoU.bathtub: 0.8209, IoU.railing: 0.3598, IoU.cushion: 0.6175, IoU.base: 0.4047, IoU.box: 0.3043, IoU.column: 0.5162, IoU.signboard: 0.3987, IoU.chest of drawers: 0.4042, IoU.counter: 0.2851, IoU.sand: 0.5475, IoU.sink: 0.7387, IoU.skyscraper: 0.5004, IoU.fireplace: 0.7564, IoU.refrigerator: 0.8082, IoU.grandstand: 0.4201, IoU.path: 0.2366, IoU.stairs: 0.2716, IoU.runway: 0.7074, IoU.case: 0.4811, IoU.pool table: 0.9325, IoU.pillow: 0.5774, IoU.screen door: 0.8570, IoU.stairway: 0.3051, IoU.river: 0.1090, IoU.bridge: 0.7000, IoU.bookcase: 0.4780, IoU.blind: 0.4040, IoU.coffee table: 0.6346, IoU.toilet: 0.8724, IoU.flower: 0.4527, IoU.book: 0.4626, IoU.hill: 0.1306, IoU.bench: 0.4996, IoU.countertop: 0.5289, IoU.stove: 0.8059, IoU.palm: 0.5719, IoU.kitchen island: 0.4772, IoU.computer: 0.7706, IoU.swivel chair: 0.4143, IoU.boat: 0.4958, IoU.bar: 0.3192, IoU.arcade machine: 0.4611, IoU.hovel: 0.3338, IoU.bus: 0.8929, IoU.towel: 0.6873, IoU.light: 0.5820, IoU.truck: 0.3986, IoU.tower: 0.3435, IoU.chandelier: 0.7249, IoU.awning: 0.3674, IoU.streetlight: 0.3364, IoU.booth: 0.4820, IoU.television receiver: 0.7318, IoU.airplane: 0.6267, IoU.dirt track: 0.0884, IoU.apparel: 0.4552, IoU.pole: 0.2169, IoU.land: 0.0506, IoU.bannister: 0.0705, IoU.escalator: 0.4754, IoU.ottoman: 0.4444, IoU.bottle: 0.3732, IoU.buffet: 0.4317, IoU.poster: 0.2467, IoU.stage: 0.1383, IoU.van: 0.3820, IoU.ship: 0.5760, IoU.fountain: 0.2649, IoU.conveyer belt: 0.7632, IoU.canopy: 0.4595, IoU.washer: 0.7041, IoU.plaything: 0.3124, IoU.swimming pool: 0.5711, IoU.stool: 0.4392, IoU.barrel: 0.2888, IoU.basket: 0.4071, IoU.waterfall: 0.4903, IoU.tent: 0.8985, IoU.bag: 0.1584, IoU.minibike: 0.7170, IoU.cradle: 0.8500, IoU.oven: 0.5456, IoU.ball: 0.5389, IoU.food: 0.5238, IoU.step: 0.0897, IoU.tank: 0.5789, IoU.trade name: 0.2948, IoU.microwave: 0.8385, IoU.pot: 0.5390, IoU.animal: 0.6129, IoU.bicycle: 0.5774, IoU.lake: 0.5332, IoU.dishwasher: 0.7261, IoU.screen: 0.5698, IoU.blanket: 0.2463, IoU.sculpture: 0.7321, IoU.hood: 0.6671, IoU.sconce: 0.5274, IoU.vase: 0.4323, IoU.traffic light: 0.3060, IoU.tray: 0.1329, IoU.ashcan: 0.4498, IoU.fan: 0.6739, IoU.pier: 0.3058, IoU.crt screen: 0.0430, IoU.plate: 0.5656, IoU.monitor: 0.2398, IoU.bulletin board: 0.4636, IoU.shower: 0.1063, IoU.radiator: 0.6389, IoU.glass: 0.1469, IoU.clock: 0.4518, IoU.flag: 0.6152, Acc.wall: 0.8916, Acc.building: 0.9168, Acc.sky: 0.9802, Acc.floor: 0.9106, Acc.tree: 0.8614, Acc.ceiling: 0.9391, Acc.road: 0.9278, Acc.bed : 0.9593, Acc.windowpane: 0.7662, Acc.grass: 0.8042, Acc.cabinet: 0.7582, Acc.sidewalk: 0.7869, Acc.person: 0.9364, Acc.earth: 0.4844, Acc.door: 0.6570, Acc.table: 0.7392, Acc.mountain: 0.7363, Acc.plant: 0.6547, Acc.curtain: 0.8817, 
Acc.chair: 0.8157, Acc.car: 0.9234, Acc.water: 0.6687, Acc.painting: 0.8878, Acc.sofa: 0.8892, Acc.shelf: 0.6431, Acc.house: 0.5705, Acc.sea: 0.8932, Acc.mirror: 0.8310, Acc.rug: 0.7673, Acc.field: 0.4875, Acc.armchair: 0.6554, Acc.seat: 0.8405, Acc.fence: 0.5711, Acc.desk: 0.6727, Acc.rock: 0.6444, Acc.wardrobe: 0.6623, Acc.lamp: 0.7941, Acc.bathtub: 0.8603, Acc.railing: 0.4380, Acc.cushion: 0.8138, Acc.base: 0.4968, Acc.box: 0.3985, Acc.column: 0.6056, Acc.signboard: 0.5726, Acc.chest of drawers: 0.5845, Acc.counter: 0.3863, Acc.sand: 0.7942, Acc.sink: 0.8373, Acc.skyscraper: 0.6252, Acc.fireplace: 0.9094, Acc.refrigerator: 0.8791, Acc.grandstand: 0.6729, Acc.path: 0.3887, Acc.stairs: 0.3221, Acc.runway: 0.9183, Acc.case: 0.6639, Acc.pool table: 0.9687, Acc.pillow: 0.6575, Acc.screen door: 0.8981, Acc.stairway: 0.3900, Acc.river: 0.1905, Acc.bridge: 0.8406, Acc.bookcase: 0.6618, Acc.blind: 0.4344, Acc.coffee table: 0.8132, Acc.toilet: 0.9052, Acc.flower: 0.5923, Acc.book: 0.7051, Acc.hill: 0.2242, Acc.bench: 0.5450, Acc.countertop: 0.7783, Acc.stove: 0.8377, Acc.palm: 0.7904, Acc.kitchen island: 0.6721, Acc.computer: 0.8682, Acc.swivel chair: 0.5619, Acc.boat: 0.5472, Acc.bar: 0.4265, Acc.arcade machine: 0.4845, Acc.hovel: 0.3710, Acc.bus: 0.9681, Acc.towel: 0.8507, Acc.light: 0.7159, Acc.truck: 0.5685, Acc.tower: 0.5803, Acc.chandelier: 0.8340, Acc.awning: 0.4802, Acc.streetlight: 0.4353, Acc.booth: 0.5522, Acc.television receiver: 0.7864, Acc.airplane: 0.6593, Acc.dirt track: 0.4062, Acc.apparel: 0.6704, Acc.pole: 0.3301, Acc.land: 0.0760, Acc.bannister: 0.0856, Acc.escalator: 0.6344, Acc.ottoman: 0.5882, Acc.bottle: 0.6038, Acc.buffet: 0.5034, Acc.poster: 0.3980, Acc.stage: 0.2089, Acc.van: 0.5042, Acc.ship: 0.8529, Acc.fountain: 0.2734, Acc.conveyer belt: 0.9257, Acc.canopy: 0.5230, Acc.washer: 0.7293, Acc.plaything: 0.4412, Acc.swimming pool: 0.7686, Acc.stool: 0.4886, Acc.barrel: 0.7460, Acc.basket: 0.6262, Acc.waterfall: 0.5861, Acc.tent: 0.9905, Acc.bag: 0.1802, Acc.minibike: 0.8792, Acc.cradle: 0.9417, Acc.oven: 0.6428, Acc.ball: 0.6677, Acc.food: 0.6148, Acc.step: 0.1086, Acc.tank: 0.5880, Acc.trade name: 0.3887, Acc.microwave: 0.9321, Acc.pot: 0.6182, Acc.animal: 0.6597, Acc.bicycle: 0.7680, Acc.lake: 0.6233, Acc.dishwasher: 0.7787, Acc.screen: 0.6821, Acc.blanket: 0.2947, Acc.sculpture: 0.8004, Acc.hood: 0.7007, Acc.sconce: 0.6686, Acc.vase: 0.5909, Acc.traffic light: 0.5720, Acc.tray: 0.1740, Acc.ashcan: 0.5609, Acc.fan: 0.8315, Acc.pier: 0.4729, Acc.crt screen: 0.1082, Acc.plate: 0.8331, Acc.monitor: 0.3199, Acc.bulletin board: 0.5921, Acc.shower: 0.1903, Acc.radiator: 0.6930, Acc.glass: 0.1570, Acc.clock: 0.5003, Acc.flag: 0.8169
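(Editor's note: the summary above reports aAcc, mIoU and mAcc over the 150 ADE20K classes using the standard confusion-count definitions. The numpy sketch below is illustrative only and is not the mmseg implementation; the ignore_index of 255 is assumed to match the seg_pad_val used in the training pipeline.)

import numpy as np

def eval_metrics(preds, gts, num_classes=150, ignore_index=255):
    # Accumulate per-class intersection, prediction and ground-truth pixel counts
    # over a list of (pred, gt) integer label maps of equal shape.
    inter = np.zeros(num_classes, dtype=np.int64)
    pred_cnt = np.zeros(num_classes, dtype=np.int64)
    gt_cnt = np.zeros(num_classes, dtype=np.int64)
    for pred, gt in zip(preds, gts):
        valid = gt != ignore_index              # drop padded / ignored pixels
        p, g = pred[valid], gt[valid]
        inter += np.bincount(p[p == g], minlength=num_classes)
        pred_cnt += np.bincount(p, minlength=num_classes)
        gt_cnt += np.bincount(g, minlength=num_classes)
    union = pred_cnt + gt_cnt - inter
    with np.errstate(divide='ignore', invalid='ignore'):
        iou = inter / union                     # per-class IoU (NaN if class absent)
        acc = inter / gt_cnt                    # per-class Acc
    return dict(aAcc=inter.sum() / gt_cnt.sum(),   # overall pixel accuracy
                mIoU=np.nanmean(iou),              # mean over classes present
                mAcc=np.nanmean(acc))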
2023-02-19 11:41:33,908 - mmseg - INFO - Iter [96050/160000]	lr: 2.398e-05, eta: 5:04:45, time: 0.566, data_time: 0.291, memory: 15214, decode.loss_ce: 0.1200, decode.acc_seg: 94.7374, aux.loss_ce: 0.0696, aux.acc_seg: 92.5142, loss: 0.1896, grad_norm: 1.8708
2023-02-19 11:41:47,494 - mmseg - INFO - Iter [96100/160000]	lr: 2.396e-05, eta: 5:04:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1148, decode.acc_seg: 95.0847, aux.loss_ce: 0.0689, aux.acc_seg: 92.7262, loss: 0.1837, grad_norm: 1.5432
2023-02-19 11:42:01,448 - mmseg - INFO - Iter [96150/160000]	lr: 2.394e-05, eta: 5:04:15, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1224, decode.acc_seg: 94.7504, aux.loss_ce: 0.0702, aux.acc_seg: 92.5954, loss: 0.1926, grad_norm: 2.6299
2023-02-19 11:42:15,856 - mmseg - INFO - Iter [96200/160000]	lr: 2.393e-05, eta: 5:04:01, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1205, decode.acc_seg: 94.8693, aux.loss_ce: 0.0680, aux.acc_seg: 92.8132, loss: 0.1884, grad_norm: 1.4554
2023-02-19 11:42:29,433 - mmseg - INFO - Iter [96250/160000]	lr: 2.391e-05, eta: 5:03:46, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1250, decode.acc_seg: 94.7798, aux.loss_ce: 0.0742, aux.acc_seg: 92.2660, loss: 0.1992, grad_norm: 1.4991
2023-02-19 11:42:43,239 - mmseg - INFO - Iter [96300/160000]	lr: 2.389e-05, eta: 5:03:32, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1163, decode.acc_seg: 94.9982, aux.loss_ce: 0.0693, aux.acc_seg: 92.5952, loss: 0.1856, grad_norm: 1.6213
2023-02-19 11:42:57,056 - mmseg - INFO - Iter [96350/160000]	lr: 2.387e-05, eta: 5:03:17, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1191, decode.acc_seg: 94.9210, aux.loss_ce: 0.0694, aux.acc_seg: 92.7479, loss: 0.1885, grad_norm: 1.8460
2023-02-19 11:43:11,131 - mmseg - INFO - Iter [96400/160000]	lr: 2.385e-05, eta: 5:03:03, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1136, decode.acc_seg: 95.0420, aux.loss_ce: 0.0676, aux.acc_seg: 92.6830, loss: 0.1812, grad_norm: 1.6839
2023-02-19 11:43:25,331 - mmseg - INFO - Iter [96450/160000]	lr: 2.383e-05, eta: 5:02:48, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.7629, aux.loss_ce: 0.0713, aux.acc_seg: 92.4857, loss: 0.1902, grad_norm: 1.7568
2023-02-19 11:43:38,976 - mmseg - INFO - Iter [96500/160000]	lr: 2.381e-05, eta: 5:02:34, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1238, decode.acc_seg: 94.5096, aux.loss_ce: 0.0727, aux.acc_seg: 92.1585, loss: 0.1965, grad_norm: 1.8085
2023-02-19 11:43:53,473 - mmseg - INFO - Iter [96550/160000]	lr: 2.379e-05, eta: 5:02:19, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1270, decode.acc_seg: 94.5353, aux.loss_ce: 0.0752, aux.acc_seg: 92.1484, loss: 0.2022, grad_norm: 1.7973
2023-02-19 11:44:08,299 - mmseg - INFO - Iter [96600/160000]	lr: 2.378e-05, eta: 5:02:06, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.8537, aux.loss_ce: 0.0697, aux.acc_seg: 92.5798, loss: 0.1886, grad_norm: 1.7000
2023-02-19 11:44:21,967 - mmseg - INFO - Iter [96650/160000]	lr: 2.376e-05, eta: 5:01:51, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.8638, aux.loss_ce: 0.0683, aux.acc_seg: 92.6739, loss: 0.1851, grad_norm: 1.4579
2023-02-19 11:44:36,473 - mmseg - INFO - Iter [96700/160000]	lr: 2.374e-05, eta: 5:01:37, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1235, decode.acc_seg: 94.7766, aux.loss_ce: 0.0739, aux.acc_seg: 92.4359, loss: 0.1974, grad_norm: 1.8712
2023-02-19 11:44:50,590 - mmseg - INFO - Iter [96750/160000]	lr: 2.372e-05, eta: 5:01:22, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.9405, aux.loss_ce: 0.0671, aux.acc_seg: 92.8499, loss: 0.1838, grad_norm: 1.4152
2023-02-19 11:45:05,027 - mmseg - INFO - Iter [96800/160000]	lr: 2.370e-05, eta: 5:01:08, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1180, decode.acc_seg: 94.8292, aux.loss_ce: 0.0682, aux.acc_seg: 92.6789, loss: 0.1861, grad_norm: 1.7242
2023-02-19 11:45:19,074 - mmseg - INFO - Iter [96850/160000]	lr: 2.368e-05, eta: 5:00:54, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1241, decode.acc_seg: 94.6975, aux.loss_ce: 0.0737, aux.acc_seg: 92.2327, loss: 0.1978, grad_norm: 1.8206
2023-02-19 11:45:32,783 - mmseg - INFO - Iter [96900/160000]	lr: 2.366e-05, eta: 5:00:39, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.9956, aux.loss_ce: 0.0692, aux.acc_seg: 92.7418, loss: 0.1859, grad_norm: 1.4096
2023-02-19 11:45:47,166 - mmseg - INFO - Iter [96950/160000]	lr: 2.364e-05, eta: 5:00:25, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1195, decode.acc_seg: 95.0299, aux.loss_ce: 0.0693, aux.acc_seg: 92.7931, loss: 0.1888, grad_norm: 1.8822
2023-02-19 11:46:01,087 - mmseg - INFO - Saving checkpoint at 97000 iterations
2023-02-19 11:46:04,385 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:46:04,385 - mmseg - INFO - Iter [97000/160000]	lr: 2.363e-05, eta: 5:00:12, time: 0.345, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1222, decode.acc_seg: 94.7320, aux.loss_ce: 0.0739, aux.acc_seg: 92.1835, loss: 0.1961, grad_norm: 1.5571
2023-02-19 11:46:18,374 - mmseg - INFO - Iter [97050/160000]	lr: 2.361e-05, eta: 4:59:58, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1275, decode.acc_seg: 94.4409, aux.loss_ce: 0.0743, aux.acc_seg: 92.0623, loss: 0.2019, grad_norm: 2.2225
2023-02-19 11:46:32,757 - mmseg - INFO - Iter [97100/160000]	lr: 2.359e-05, eta: 4:59:44, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1172, decode.acc_seg: 94.9068, aux.loss_ce: 0.0683, aux.acc_seg: 92.6673, loss: 0.1854, grad_norm: 1.5766
2023-02-19 11:46:46,434 - mmseg - INFO - Iter [97150/160000]	lr: 2.357e-05, eta: 4:59:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1261, decode.acc_seg: 94.6786, aux.loss_ce: 0.0737, aux.acc_seg: 92.2956, loss: 0.1998, grad_norm: 1.8433
2023-02-19 11:47:00,068 - mmseg - INFO - Iter [97200/160000]	lr: 2.355e-05, eta: 4:59:14, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1204, decode.acc_seg: 94.8437, aux.loss_ce: 0.0712, aux.acc_seg: 92.5991, loss: 0.1917, grad_norm: 1.6323
2023-02-19 11:47:13,711 - mmseg - INFO - Iter [97250/160000]	lr: 2.353e-05, eta: 4:58:59, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1198, decode.acc_seg: 94.8022, aux.loss_ce: 0.0719, aux.acc_seg: 92.4093, loss: 0.1917, grad_norm: 1.4800
2023-02-19 11:47:29,846 - mmseg - INFO - Iter [97300/160000]	lr: 2.351e-05, eta: 4:58:46, time: 0.323, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1202, decode.acc_seg: 94.9919, aux.loss_ce: 0.0703, aux.acc_seg: 92.8321, loss: 0.1905, grad_norm: 1.6336
2023-02-19 11:47:44,162 - mmseg - INFO - Iter [97350/160000]	lr: 2.349e-05, eta: 4:58:32, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1269, decode.acc_seg: 94.5228, aux.loss_ce: 0.0746, aux.acc_seg: 92.1025, loss: 0.2015, grad_norm: 1.5982
2023-02-19 11:47:58,878 - mmseg - INFO - Iter [97400/160000]	lr: 2.348e-05, eta: 4:58:18, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 94.9947, aux.loss_ce: 0.0684, aux.acc_seg: 92.6797, loss: 0.1836, grad_norm: 1.6231
2023-02-19 11:48:13,252 - mmseg - INFO - Iter [97450/160000]	lr: 2.346e-05, eta: 4:58:04, time: 0.288, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.8183, aux.loss_ce: 0.0704, aux.acc_seg: 92.5054, loss: 0.1892, grad_norm: 1.5058
2023-02-19 11:48:27,114 - mmseg - INFO - Iter [97500/160000]	lr: 2.344e-05, eta: 4:57:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1215, decode.acc_seg: 94.6989, aux.loss_ce: 0.0704, aux.acc_seg: 92.4761, loss: 0.1918, grad_norm: 1.6276
2023-02-19 11:48:41,482 - mmseg - INFO - Iter [97550/160000]	lr: 2.342e-05, eta: 4:57:35, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1214, decode.acc_seg: 94.8467, aux.loss_ce: 0.0685, aux.acc_seg: 92.7687, loss: 0.1899, grad_norm: 1.4287
2023-02-19 11:48:55,422 - mmseg - INFO - Iter [97600/160000]	lr: 2.340e-05, eta: 4:57:20, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1231, decode.acc_seg: 94.6586, aux.loss_ce: 0.0701, aux.acc_seg: 92.5507, loss: 0.1931, grad_norm: 1.8624
2023-02-19 11:49:09,110 - mmseg - INFO - Iter [97650/160000]	lr: 2.338e-05, eta: 4:57:06, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1215, decode.acc_seg: 94.6744, aux.loss_ce: 0.0725, aux.acc_seg: 92.2292, loss: 0.1940, grad_norm: 2.4574
2023-02-19 11:49:22,859 - mmseg - INFO - Iter [97700/160000]	lr: 2.336e-05, eta: 4:56:51, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1165, decode.acc_seg: 94.9561, aux.loss_ce: 0.0687, aux.acc_seg: 92.6651, loss: 0.1852, grad_norm: 1.4079
2023-02-19 11:49:36,650 - mmseg - INFO - Iter [97750/160000]	lr: 2.334e-05, eta: 4:56:36, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1166, decode.acc_seg: 94.9154, aux.loss_ce: 0.0676, aux.acc_seg: 92.7169, loss: 0.1842, grad_norm: 1.4065
2023-02-19 11:49:50,918 - mmseg - INFO - Iter [97800/160000]	lr: 2.333e-05, eta: 4:56:22, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1194, decode.acc_seg: 94.8168, aux.loss_ce: 0.0701, aux.acc_seg: 92.5495, loss: 0.1896, grad_norm: 1.6273
2023-02-19 11:50:04,823 - mmseg - INFO - Iter [97850/160000]	lr: 2.331e-05, eta: 4:56:08, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1224, decode.acc_seg: 95.0056, aux.loss_ce: 0.0714, aux.acc_seg: 92.8336, loss: 0.1937, grad_norm: 1.7886
2023-02-19 11:50:18,442 - mmseg - INFO - Iter [97900/160000]	lr: 2.329e-05, eta: 4:55:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 95.2198, aux.loss_ce: 0.0683, aux.acc_seg: 93.0421, loss: 0.1836, grad_norm: 1.6635
2023-02-19 11:50:32,259 - mmseg - INFO - Iter [97950/160000]	lr: 2.327e-05, eta: 4:55:38, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1206, decode.acc_seg: 94.6559, aux.loss_ce: 0.0731, aux.acc_seg: 92.2868, loss: 0.1937, grad_norm: 1.9424
2023-02-19 11:50:46,361 - mmseg - INFO - Saving checkpoint at 98000 iterations
2023-02-19 11:50:49,767 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:50:49,768 - mmseg - INFO - Iter [98000/160000]	lr: 2.325e-05, eta: 4:55:26, time: 0.351, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1248, decode.acc_seg: 94.6289, aux.loss_ce: 0.0722, aux.acc_seg: 92.3380, loss: 0.1970, grad_norm: 1.6777
2023-02-19 11:51:03,537 - mmseg - INFO - Iter [98050/160000]	lr: 2.323e-05, eta: 4:55:11, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1145, decode.acc_seg: 94.9716, aux.loss_ce: 0.0676, aux.acc_seg: 92.6824, loss: 0.1821, grad_norm: 1.7259
2023-02-19 11:51:18,311 - mmseg - INFO - Iter [98100/160000]	lr: 2.321e-05, eta: 4:54:57, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1168, decode.acc_seg: 94.9732, aux.loss_ce: 0.0672, aux.acc_seg: 92.8877, loss: 0.1840, grad_norm: 2.1801
2023-02-19 11:51:31,891 - mmseg - INFO - Iter [98150/160000]	lr: 2.319e-05, eta: 4:54:43, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1220, decode.acc_seg: 94.7773, aux.loss_ce: 0.0718, aux.acc_seg: 92.4787, loss: 0.1938, grad_norm: 1.8676
2023-02-19 11:51:45,961 - mmseg - INFO - Iter [98200/160000]	lr: 2.318e-05, eta: 4:54:28, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1179, decode.acc_seg: 94.8720, aux.loss_ce: 0.0697, aux.acc_seg: 92.5706, loss: 0.1877, grad_norm: 1.8234
2023-02-19 11:51:59,888 - mmseg - INFO - Iter [98250/160000]	lr: 2.316e-05, eta: 4:54:14, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1131, decode.acc_seg: 95.0472, aux.loss_ce: 0.0678, aux.acc_seg: 92.7723, loss: 0.1809, grad_norm: 1.4185
2023-02-19 11:52:14,327 - mmseg - INFO - Iter [98300/160000]	lr: 2.314e-05, eta: 4:53:59, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1265, decode.acc_seg: 94.6667, aux.loss_ce: 0.0735, aux.acc_seg: 92.5257, loss: 0.2000, grad_norm: 1.6701
2023-02-19 11:52:28,624 - mmseg - INFO - Iter [98350/160000]	lr: 2.312e-05, eta: 4:53:45, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1116, decode.acc_seg: 95.0375, aux.loss_ce: 0.0656, aux.acc_seg: 92.9000, loss: 0.1772, grad_norm: 1.4712
2023-02-19 11:52:42,196 - mmseg - INFO - Iter [98400/160000]	lr: 2.310e-05, eta: 4:53:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1136, decode.acc_seg: 94.9948, aux.loss_ce: 0.0682, aux.acc_seg: 92.6055, loss: 0.1818, grad_norm: 1.6699
2023-02-19 11:52:56,083 - mmseg - INFO - Iter [98450/160000]	lr: 2.308e-05, eta: 4:53:16, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1235, decode.acc_seg: 94.6285, aux.loss_ce: 0.0723, aux.acc_seg: 92.4022, loss: 0.1958, grad_norm: 1.9541
2023-02-19 11:53:09,967 - mmseg - INFO - Iter [98500/160000]	lr: 2.306e-05, eta: 4:53:01, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.8716, aux.loss_ce: 0.0693, aux.acc_seg: 92.6509, loss: 0.1882, grad_norm: 1.6443
2023-02-19 11:53:25,855 - mmseg - INFO - Iter [98550/160000]	lr: 2.304e-05, eta: 4:52:48, time: 0.317, data_time: 0.049, memory: 15214, decode.loss_ce: 0.1118, decode.acc_seg: 95.1298, aux.loss_ce: 0.0654, aux.acc_seg: 92.9854, loss: 0.1772, grad_norm: 1.4466
2023-02-19 11:53:39,522 - mmseg - INFO - Iter [98600/160000]	lr: 2.303e-05, eta: 4:52:33, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1183, decode.acc_seg: 94.8886, aux.loss_ce: 0.0698, aux.acc_seg: 92.5916, loss: 0.1881, grad_norm: 1.7347
2023-02-19 11:53:53,125 - mmseg - INFO - Iter [98650/160000]	lr: 2.301e-05, eta: 4:52:19, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1210, decode.acc_seg: 94.8159, aux.loss_ce: 0.0684, aux.acc_seg: 92.8088, loss: 0.1894, grad_norm: 1.4395
2023-02-19 11:54:07,306 - mmseg - INFO - Iter [98700/160000]	lr: 2.299e-05, eta: 4:52:04, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1097, decode.acc_seg: 95.3275, aux.loss_ce: 0.0661, aux.acc_seg: 93.0915, loss: 0.1758, grad_norm: 1.7745
2023-02-19 11:54:21,321 - mmseg - INFO - Iter [98750/160000]	lr: 2.297e-05, eta: 4:51:50, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1191, decode.acc_seg: 94.8686, aux.loss_ce: 0.0693, aux.acc_seg: 92.6760, loss: 0.1884, grad_norm: 1.8099
2023-02-19 11:54:35,516 - mmseg - INFO - Iter [98800/160000]	lr: 2.295e-05, eta: 4:51:35, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1164, decode.acc_seg: 95.0463, aux.loss_ce: 0.0686, aux.acc_seg: 92.8064, loss: 0.1850, grad_norm: 1.8064
2023-02-19 11:54:49,899 - mmseg - INFO - Iter [98850/160000]	lr: 2.293e-05, eta: 4:51:21, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1212, decode.acc_seg: 94.7378, aux.loss_ce: 0.0712, aux.acc_seg: 92.4675, loss: 0.1924, grad_norm: 1.6526
2023-02-19 11:55:04,374 - mmseg - INFO - Iter [98900/160000]	lr: 2.291e-05, eta: 4:51:07, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1158, decode.acc_seg: 94.9329, aux.loss_ce: 0.0677, aux.acc_seg: 92.7102, loss: 0.1836, grad_norm: 1.8884
2023-02-19 11:55:18,401 - mmseg - INFO - Iter [98950/160000]	lr: 2.289e-05, eta: 4:50:53, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 94.9938, aux.loss_ce: 0.0662, aux.acc_seg: 92.7517, loss: 0.1787, grad_norm: 1.5192
2023-02-19 11:55:32,415 - mmseg - INFO - Saving checkpoint at 99000 iterations
2023-02-19 11:55:35,660 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 11:55:35,660 - mmseg - INFO - Iter [99000/160000]	lr: 2.288e-05, eta: 4:50:40, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1168, decode.acc_seg: 95.0093, aux.loss_ce: 0.0695, aux.acc_seg: 92.6957, loss: 0.1863, grad_norm: 1.5148
2023-02-19 11:55:49,548 - mmseg - INFO - Iter [99050/160000]	lr: 2.286e-05, eta: 4:50:26, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1098, decode.acc_seg: 95.2342, aux.loss_ce: 0.0658, aux.acc_seg: 92.9919, loss: 0.1757, grad_norm: 1.3298
2023-02-19 11:56:03,740 - mmseg - INFO - Iter [99100/160000]	lr: 2.284e-05, eta: 4:50:11, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1224, decode.acc_seg: 94.7837, aux.loss_ce: 0.0697, aux.acc_seg: 92.6648, loss: 0.1921, grad_norm: 1.9422
2023-02-19 11:56:17,789 - mmseg - INFO - Iter [99150/160000]	lr: 2.282e-05, eta: 4:49:57, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1158, decode.acc_seg: 94.9962, aux.loss_ce: 0.0665, aux.acc_seg: 92.9380, loss: 0.1823, grad_norm: 1.3292
2023-02-19 11:56:32,162 - mmseg - INFO - Iter [99200/160000]	lr: 2.280e-05, eta: 4:49:43, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1203, decode.acc_seg: 94.8965, aux.loss_ce: 0.0725, aux.acc_seg: 92.4358, loss: 0.1927, grad_norm: 1.5623
2023-02-19 11:56:46,431 - mmseg - INFO - Iter [99250/160000]	lr: 2.278e-05, eta: 4:49:28, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1168, decode.acc_seg: 94.9253, aux.loss_ce: 0.0674, aux.acc_seg: 92.8051, loss: 0.1842, grad_norm: 4.3105
2023-02-19 11:57:01,245 - mmseg - INFO - Iter [99300/160000]	lr: 2.276e-05, eta: 4:49:14, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1240, decode.acc_seg: 94.6429, aux.loss_ce: 0.0700, aux.acc_seg: 92.5129, loss: 0.1940, grad_norm: 1.3384
2023-02-19 11:57:15,468 - mmseg - INFO - Iter [99350/160000]	lr: 2.274e-05, eta: 4:49:00, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1115, decode.acc_seg: 94.9820, aux.loss_ce: 0.0654, aux.acc_seg: 92.8192, loss: 0.1768, grad_norm: 1.4601
2023-02-19 11:57:29,212 - mmseg - INFO - Iter [99400/160000]	lr: 2.273e-05, eta: 4:48:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1193, decode.acc_seg: 94.9511, aux.loss_ce: 0.0699, aux.acc_seg: 92.7100, loss: 0.1891, grad_norm: 1.5746
2023-02-19 11:57:42,771 - mmseg - INFO - Iter [99450/160000]	lr: 2.271e-05, eta: 4:48:31, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1173, decode.acc_seg: 94.8757, aux.loss_ce: 0.0678, aux.acc_seg: 92.7316, loss: 0.1851, grad_norm: 1.4159
2023-02-19 11:57:57,120 - mmseg - INFO - Iter [99500/160000]	lr: 2.269e-05, eta: 4:48:16, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.9206, aux.loss_ce: 0.0677, aux.acc_seg: 92.7785, loss: 0.1844, grad_norm: 1.8011
2023-02-19 11:58:10,668 - mmseg - INFO - Iter [99550/160000]	lr: 2.267e-05, eta: 4:48:02, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1217, decode.acc_seg: 94.6552, aux.loss_ce: 0.0687, aux.acc_seg: 92.6259, loss: 0.1905, grad_norm: 1.3886
2023-02-19 11:58:24,949 - mmseg - INFO - Iter [99600/160000]	lr: 2.265e-05, eta: 4:47:47, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1196, decode.acc_seg: 94.7658, aux.loss_ce: 0.0694, aux.acc_seg: 92.6435, loss: 0.1889, grad_norm: 1.5568
2023-02-19 11:58:38,814 - mmseg - INFO - Iter [99650/160000]	lr: 2.263e-05, eta: 4:47:33, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 95.0627, aux.loss_ce: 0.0699, aux.acc_seg: 92.7127, loss: 0.1866, grad_norm: 1.8619
2023-02-19 11:58:52,545 - mmseg - INFO - Iter [99700/160000]	lr: 2.261e-05, eta: 4:47:18, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1197, decode.acc_seg: 94.7785, aux.loss_ce: 0.0704, aux.acc_seg: 92.4312, loss: 0.1901, grad_norm: 1.5243
2023-02-19 11:59:06,556 - mmseg - INFO - Iter [99750/160000]	lr: 2.259e-05, eta: 4:47:04, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1199, decode.acc_seg: 94.8595, aux.loss_ce: 0.0719, aux.acc_seg: 92.3651, loss: 0.1918, grad_norm: 1.7082
2023-02-19 11:59:22,418 - mmseg - INFO - Iter [99800/160000]	lr: 2.258e-05, eta: 4:46:50, time: 0.317, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1096, decode.acc_seg: 95.2091, aux.loss_ce: 0.0654, aux.acc_seg: 93.0055, loss: 0.1750, grad_norm: 1.5454
2023-02-19 11:59:36,747 - mmseg - INFO - Iter [99850/160000]	lr: 2.256e-05, eta: 4:46:36, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1111, decode.acc_seg: 95.1382, aux.loss_ce: 0.0644, aux.acc_seg: 93.0670, loss: 0.1755, grad_norm: 1.3518
2023-02-19 11:59:50,774 - mmseg - INFO - Iter [99900/160000]	lr: 2.254e-05, eta: 4:46:21, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1203, decode.acc_seg: 94.9781, aux.loss_ce: 0.0697, aux.acc_seg: 92.9332, loss: 0.1900, grad_norm: 1.3851
2023-02-19 12:00:04,557 - mmseg - INFO - Iter [99950/160000]	lr: 2.252e-05, eta: 4:46:07, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1145, decode.acc_seg: 95.0959, aux.loss_ce: 0.0688, aux.acc_seg: 92.7021, loss: 0.1833, grad_norm: 1.5233
2023-02-19 12:00:18,582 - mmseg - INFO - Saving checkpoint at 100000 iterations
2023-02-19 12:00:21,805 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:00:21,806 - mmseg - INFO - Iter [100000/160000]	lr: 2.250e-05, eta: 4:45:54, time: 0.345, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.2882, aux.loss_ce: 0.0665, aux.acc_seg: 93.0550, loss: 0.1774, grad_norm: 1.4628
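(Editor's note: each "Saving checkpoint" line above typically writes an iter_<N>.pth file, plus a latest.pth link, into the run's work_dir, which is not shown in this log excerpt. A minimal sketch for loading the 100k-iteration checkpoint with the standard MMSegmentation 0.x inference API; the checkpoint path and image name below are placeholders.)

from mmseg.apis import inference_segmentor, init_segmentor

# Config name taken from the "Exp name" lines in this log; work_dir path is assumed.
config_file = 'diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py'
checkpoint_file = 'work_dirs/diffseg_swin_l_2x8_512x512_160k_ade20k_v20/iter_100000.pth'

# Build the model from the config and load the saved weights.
model = init_segmentor(config_file, checkpoint_file, device='cuda:0')

# Run single-image inference; the result is a list with one per-pixel label map.
result = inference_segmentor(model, 'demo.jpg')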
2023-02-19 12:00:35,563 - mmseg - INFO - Iter [100050/160000]	lr: 2.248e-05, eta: 4:45:40, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1185, decode.acc_seg: 94.8890, aux.loss_ce: 0.0709, aux.acc_seg: 92.5434, loss: 0.1895, grad_norm: 1.6865
2023-02-19 12:00:49,610 - mmseg - INFO - Iter [100100/160000]	lr: 2.246e-05, eta: 4:45:25, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1200, decode.acc_seg: 94.7564, aux.loss_ce: 0.0709, aux.acc_seg: 92.3674, loss: 0.1909, grad_norm: 1.7800
2023-02-19 12:01:03,530 - mmseg - INFO - Iter [100150/160000]	lr: 2.244e-05, eta: 4:45:11, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1233, decode.acc_seg: 94.5120, aux.loss_ce: 0.0723, aux.acc_seg: 92.2168, loss: 0.1956, grad_norm: 1.7129
2023-02-19 12:01:17,173 - mmseg - INFO - Iter [100200/160000]	lr: 2.243e-05, eta: 4:44:56, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1122, decode.acc_seg: 95.0535, aux.loss_ce: 0.0655, aux.acc_seg: 92.8914, loss: 0.1778, grad_norm: 1.7537
2023-02-19 12:01:31,052 - mmseg - INFO - Iter [100250/160000]	lr: 2.241e-05, eta: 4:44:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.0519, aux.loss_ce: 0.0681, aux.acc_seg: 92.6618, loss: 0.1816, grad_norm: 1.8507
2023-02-19 12:01:45,110 - mmseg - INFO - Iter [100300/160000]	lr: 2.239e-05, eta: 4:44:27, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 95.1348, aux.loss_ce: 0.0663, aux.acc_seg: 92.8886, loss: 0.1788, grad_norm: 2.1427
2023-02-19 12:02:00,227 - mmseg - INFO - Iter [100350/160000]	lr: 2.237e-05, eta: 4:44:13, time: 0.302, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1168, decode.acc_seg: 94.8139, aux.loss_ce: 0.0684, aux.acc_seg: 92.5988, loss: 0.1852, grad_norm: 1.4902
2023-02-19 12:02:13,971 - mmseg - INFO - Iter [100400/160000]	lr: 2.235e-05, eta: 4:43:59, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1136, decode.acc_seg: 94.9690, aux.loss_ce: 0.0686, aux.acc_seg: 92.6009, loss: 0.1821, grad_norm: 1.5446
2023-02-19 12:02:27,716 - mmseg - INFO - Iter [100450/160000]	lr: 2.233e-05, eta: 4:43:44, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1123, decode.acc_seg: 95.2036, aux.loss_ce: 0.0671, aux.acc_seg: 93.0162, loss: 0.1794, grad_norm: 1.7772
2023-02-19 12:02:41,441 - mmseg - INFO - Iter [100500/160000]	lr: 2.231e-05, eta: 4:43:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.8743, aux.loss_ce: 0.0729, aux.acc_seg: 92.2692, loss: 0.1895, grad_norm: 1.7444
2023-02-19 12:02:55,012 - mmseg - INFO - Iter [100550/160000]	lr: 2.229e-05, eta: 4:43:15, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1137, decode.acc_seg: 95.1385, aux.loss_ce: 0.0678, aux.acc_seg: 92.8267, loss: 0.1815, grad_norm: 1.3609
2023-02-19 12:03:10,219 - mmseg - INFO - Iter [100600/160000]	lr: 2.228e-05, eta: 4:43:01, time: 0.303, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1196, decode.acc_seg: 94.7273, aux.loss_ce: 0.0692, aux.acc_seg: 92.4222, loss: 0.1888, grad_norm: 1.4461
2023-02-19 12:03:24,517 - mmseg - INFO - Iter [100650/160000]	lr: 2.226e-05, eta: 4:42:47, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1138, decode.acc_seg: 95.0534, aux.loss_ce: 0.0677, aux.acc_seg: 92.7513, loss: 0.1815, grad_norm: 1.4785
2023-02-19 12:03:38,280 - mmseg - INFO - Iter [100700/160000]	lr: 2.224e-05, eta: 4:42:32, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1165, decode.acc_seg: 94.8486, aux.loss_ce: 0.0676, aux.acc_seg: 92.6965, loss: 0.1841, grad_norm: 1.4181
2023-02-19 12:03:52,051 - mmseg - INFO - Iter [100750/160000]	lr: 2.222e-05, eta: 4:42:17, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1233, decode.acc_seg: 94.7500, aux.loss_ce: 0.0733, aux.acc_seg: 92.2713, loss: 0.1966, grad_norm: 1.4628
2023-02-19 12:04:05,682 - mmseg - INFO - Iter [100800/160000]	lr: 2.220e-05, eta: 4:42:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1202, decode.acc_seg: 94.7917, aux.loss_ce: 0.0710, aux.acc_seg: 92.5110, loss: 0.1912, grad_norm: 2.2878
2023-02-19 12:04:19,769 - mmseg - INFO - Iter [100850/160000]	lr: 2.218e-05, eta: 4:41:48, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1110, decode.acc_seg: 95.0668, aux.loss_ce: 0.0652, aux.acc_seg: 92.8098, loss: 0.1762, grad_norm: 1.4437
2023-02-19 12:04:34,378 - mmseg - INFO - Iter [100900/160000]	lr: 2.216e-05, eta: 4:41:34, time: 0.292, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1131, decode.acc_seg: 94.9301, aux.loss_ce: 0.0660, aux.acc_seg: 92.7191, loss: 0.1791, grad_norm: 1.6652
2023-02-19 12:04:47,912 - mmseg - INFO - Iter [100950/160000]	lr: 2.214e-05, eta: 4:41:20, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1150, decode.acc_seg: 94.9893, aux.loss_ce: 0.0673, aux.acc_seg: 92.8108, loss: 0.1823, grad_norm: 1.3792
2023-02-19 12:05:01,540 - mmseg - INFO - Saving checkpoint at 101000 iterations
2023-02-19 12:05:04,758 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:05:04,759 - mmseg - INFO - Iter [101000/160000]	lr: 2.213e-05, eta: 4:41:07, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1250, decode.acc_seg: 94.5719, aux.loss_ce: 0.0713, aux.acc_seg: 92.5361, loss: 0.1963, grad_norm: 1.3817
2023-02-19 12:05:21,009 - mmseg - INFO - Iter [101050/160000]	lr: 2.211e-05, eta: 4:40:54, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1163, decode.acc_seg: 94.9409, aux.loss_ce: 0.0648, aux.acc_seg: 93.0552, loss: 0.1811, grad_norm: 1.5572
2023-02-19 12:05:34,702 - mmseg - INFO - Iter [101100/160000]	lr: 2.209e-05, eta: 4:40:39, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1184, decode.acc_seg: 94.9002, aux.loss_ce: 0.0704, aux.acc_seg: 92.6809, loss: 0.1888, grad_norm: 1.8267
2023-02-19 12:05:48,249 - mmseg - INFO - Iter [101150/160000]	lr: 2.207e-05, eta: 4:40:24, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1138, decode.acc_seg: 94.9924, aux.loss_ce: 0.0675, aux.acc_seg: 92.7932, loss: 0.1813, grad_norm: 1.5091
2023-02-19 12:06:01,961 - mmseg - INFO - Iter [101200/160000]	lr: 2.205e-05, eta: 4:40:10, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1210, decode.acc_seg: 94.7155, aux.loss_ce: 0.0704, aux.acc_seg: 92.5109, loss: 0.1914, grad_norm: 1.5220
2023-02-19 12:06:16,072 - mmseg - INFO - Iter [101250/160000]	lr: 2.203e-05, eta: 4:39:55, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1150, decode.acc_seg: 94.9553, aux.loss_ce: 0.0655, aux.acc_seg: 93.0445, loss: 0.1805, grad_norm: 1.6919
2023-02-19 12:06:29,989 - mmseg - INFO - Iter [101300/160000]	lr: 2.201e-05, eta: 4:39:41, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1206, decode.acc_seg: 94.7619, aux.loss_ce: 0.0715, aux.acc_seg: 92.4189, loss: 0.1921, grad_norm: 1.7880
2023-02-19 12:06:43,908 - mmseg - INFO - Iter [101350/160000]	lr: 2.199e-05, eta: 4:39:26, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1140, decode.acc_seg: 94.9712, aux.loss_ce: 0.0649, aux.acc_seg: 92.9917, loss: 0.1789, grad_norm: 1.3461
2023-02-19 12:06:58,328 - mmseg - INFO - Iter [101400/160000]	lr: 2.198e-05, eta: 4:39:12, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1198, decode.acc_seg: 95.0229, aux.loss_ce: 0.0700, aux.acc_seg: 92.6672, loss: 0.1898, grad_norm: 1.6250
2023-02-19 12:07:11,929 - mmseg - INFO - Iter [101450/160000]	lr: 2.196e-05, eta: 4:38:57, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1174, decode.acc_seg: 94.8744, aux.loss_ce: 0.0689, aux.acc_seg: 92.5775, loss: 0.1863, grad_norm: 1.6248
2023-02-19 12:07:25,858 - mmseg - INFO - Iter [101500/160000]	lr: 2.194e-05, eta: 4:38:43, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1137, decode.acc_seg: 95.0987, aux.loss_ce: 0.0673, aux.acc_seg: 92.8383, loss: 0.1810, grad_norm: 1.5143
2023-02-19 12:07:39,559 - mmseg - INFO - Iter [101550/160000]	lr: 2.192e-05, eta: 4:38:28, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1206, decode.acc_seg: 94.7663, aux.loss_ce: 0.0701, aux.acc_seg: 92.5090, loss: 0.1908, grad_norm: 1.9374
2023-02-19 12:07:53,570 - mmseg - INFO - Iter [101600/160000]	lr: 2.190e-05, eta: 4:38:14, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1196, decode.acc_seg: 94.8750, aux.loss_ce: 0.0702, aux.acc_seg: 92.6689, loss: 0.1897, grad_norm: 1.4902
2023-02-19 12:08:07,525 - mmseg - INFO - Iter [101650/160000]	lr: 2.188e-05, eta: 4:37:59, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1161, decode.acc_seg: 95.0433, aux.loss_ce: 0.0671, aux.acc_seg: 93.0220, loss: 0.1832, grad_norm: 1.7728
2023-02-19 12:08:21,390 - mmseg - INFO - Iter [101700/160000]	lr: 2.186e-05, eta: 4:37:45, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1162, decode.acc_seg: 94.9956, aux.loss_ce: 0.0684, aux.acc_seg: 92.7117, loss: 0.1847, grad_norm: 1.6768
2023-02-19 12:08:34,942 - mmseg - INFO - Iter [101750/160000]	lr: 2.184e-05, eta: 4:37:30, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1166, decode.acc_seg: 95.0474, aux.loss_ce: 0.0709, aux.acc_seg: 92.5551, loss: 0.1875, grad_norm: 1.6810
2023-02-19 12:08:48,795 - mmseg - INFO - Iter [101800/160000]	lr: 2.183e-05, eta: 4:37:15, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1131, decode.acc_seg: 95.0152, aux.loss_ce: 0.0707, aux.acc_seg: 92.5339, loss: 0.1838, grad_norm: 2.7977
2023-02-19 12:09:02,384 - mmseg - INFO - Iter [101850/160000]	lr: 2.181e-05, eta: 4:37:01, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1185, decode.acc_seg: 94.9194, aux.loss_ce: 0.0695, aux.acc_seg: 92.8475, loss: 0.1881, grad_norm: 1.7152
2023-02-19 12:09:16,141 - mmseg - INFO - Iter [101900/160000]	lr: 2.179e-05, eta: 4:36:46, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1229, decode.acc_seg: 94.6234, aux.loss_ce: 0.0717, aux.acc_seg: 92.4268, loss: 0.1945, grad_norm: 1.4983
2023-02-19 12:09:29,710 - mmseg - INFO - Iter [101950/160000]	lr: 2.177e-05, eta: 4:36:31, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.4106, aux.loss_ce: 0.0655, aux.acc_seg: 93.1862, loss: 0.1743, grad_norm: 1.6861
2023-02-19 12:09:44,080 - mmseg - INFO - Saving checkpoint at 102000 iterations
2023-02-19 12:09:47,306 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:09:47,306 - mmseg - INFO - Iter [102000/160000]	lr: 2.175e-05, eta: 4:36:19, time: 0.352, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 94.9394, aux.loss_ce: 0.0670, aux.acc_seg: 92.7028, loss: 0.1823, grad_norm: 1.7889
2023-02-19 12:10:01,640 - mmseg - INFO - Iter [102050/160000]	lr: 2.173e-05, eta: 4:36:05, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1222, decode.acc_seg: 94.8379, aux.loss_ce: 0.0718, aux.acc_seg: 92.5215, loss: 0.1940, grad_norm: 1.8271
2023-02-19 12:10:15,694 - mmseg - INFO - Iter [102100/160000]	lr: 2.171e-05, eta: 4:35:50, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1127, decode.acc_seg: 95.0961, aux.loss_ce: 0.0665, aux.acc_seg: 92.8838, loss: 0.1792, grad_norm: 1.8319
2023-02-19 12:10:29,478 - mmseg - INFO - Iter [102150/160000]	lr: 2.169e-05, eta: 4:35:36, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1121, decode.acc_seg: 95.0503, aux.loss_ce: 0.0677, aux.acc_seg: 92.7756, loss: 0.1798, grad_norm: 1.5909
2023-02-19 12:10:43,426 - mmseg - INFO - Iter [102200/160000]	lr: 2.168e-05, eta: 4:35:21, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1126, decode.acc_seg: 95.0507, aux.loss_ce: 0.0664, aux.acc_seg: 92.9088, loss: 0.1789, grad_norm: 1.6952
2023-02-19 12:10:57,734 - mmseg - INFO - Iter [102250/160000]	lr: 2.166e-05, eta: 4:35:07, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 94.9815, aux.loss_ce: 0.0678, aux.acc_seg: 92.6043, loss: 0.1802, grad_norm: 1.5091
2023-02-19 12:11:11,352 - mmseg - INFO - Iter [102300/160000]	lr: 2.164e-05, eta: 4:34:52, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1155, decode.acc_seg: 94.9946, aux.loss_ce: 0.0680, aux.acc_seg: 92.7227, loss: 0.1835, grad_norm: 1.4921
2023-02-19 12:11:27,207 - mmseg - INFO - Iter [102350/160000]	lr: 2.162e-05, eta: 4:34:39, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1136, decode.acc_seg: 94.9343, aux.loss_ce: 0.0654, aux.acc_seg: 92.8710, loss: 0.1790, grad_norm: 1.4860
2023-02-19 12:11:40,772 - mmseg - INFO - Iter [102400/160000]	lr: 2.160e-05, eta: 4:34:24, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 95.0510, aux.loss_ce: 0.0664, aux.acc_seg: 93.0222, loss: 0.1815, grad_norm: 1.6792
2023-02-19 12:11:54,933 - mmseg - INFO - Iter [102450/160000]	lr: 2.158e-05, eta: 4:34:10, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1140, decode.acc_seg: 95.0091, aux.loss_ce: 0.0686, aux.acc_seg: 92.6672, loss: 0.1826, grad_norm: 1.5599
2023-02-19 12:12:08,995 - mmseg - INFO - Iter [102500/160000]	lr: 2.156e-05, eta: 4:33:55, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1078, decode.acc_seg: 95.2422, aux.loss_ce: 0.0633, aux.acc_seg: 93.1905, loss: 0.1711, grad_norm: 1.4787
2023-02-19 12:12:24,404 - mmseg - INFO - Iter [102550/160000]	lr: 2.154e-05, eta: 4:33:42, time: 0.308, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.0690, aux.loss_ce: 0.0658, aux.acc_seg: 92.8691, loss: 0.1759, grad_norm: 1.4593
2023-02-19 12:12:38,484 - mmseg - INFO - Iter [102600/160000]	lr: 2.153e-05, eta: 4:33:27, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1134, decode.acc_seg: 95.1988, aux.loss_ce: 0.0674, aux.acc_seg: 92.9866, loss: 0.1808, grad_norm: 2.3677
2023-02-19 12:12:52,149 - mmseg - INFO - Iter [102650/160000]	lr: 2.151e-05, eta: 4:33:13, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1123, decode.acc_seg: 95.0313, aux.loss_ce: 0.0657, aux.acc_seg: 92.9081, loss: 0.1780, grad_norm: 1.4791
2023-02-19 12:13:06,228 - mmseg - INFO - Iter [102700/160000]	lr: 2.149e-05, eta: 4:32:58, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1094, decode.acc_seg: 95.2643, aux.loss_ce: 0.0640, aux.acc_seg: 93.1379, loss: 0.1734, grad_norm: 1.5172
2023-02-19 12:13:20,173 - mmseg - INFO - Iter [102750/160000]	lr: 2.147e-05, eta: 4:32:44, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1141, decode.acc_seg: 95.0222, aux.loss_ce: 0.0675, aux.acc_seg: 92.7303, loss: 0.1815, grad_norm: 1.4805
2023-02-19 12:13:34,091 - mmseg - INFO - Iter [102800/160000]	lr: 2.145e-05, eta: 4:32:29, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1218, decode.acc_seg: 94.7899, aux.loss_ce: 0.0715, aux.acc_seg: 92.5135, loss: 0.1932, grad_norm: 1.5452
2023-02-19 12:13:48,234 - mmseg - INFO - Iter [102850/160000]	lr: 2.143e-05, eta: 4:32:15, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 94.9738, aux.loss_ce: 0.0676, aux.acc_seg: 92.8142, loss: 0.1829, grad_norm: 1.5970
2023-02-19 12:14:01,984 - mmseg - INFO - Iter [102900/160000]	lr: 2.141e-05, eta: 4:32:00, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.0454, aux.loss_ce: 0.0675, aux.acc_seg: 92.8237, loss: 0.1810, grad_norm: 1.3451
2023-02-19 12:14:16,556 - mmseg - INFO - Iter [102950/160000]	lr: 2.139e-05, eta: 4:31:46, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1144, decode.acc_seg: 95.0633, aux.loss_ce: 0.0657, aux.acc_seg: 93.0654, loss: 0.1800, grad_norm: 1.5296
2023-02-19 12:14:30,322 - mmseg - INFO - Saving checkpoint at 103000 iterations
2023-02-19 12:14:33,547 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:14:33,547 - mmseg - INFO - Iter [103000/160000]	lr: 2.138e-05, eta: 4:31:33, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1158, decode.acc_seg: 94.8568, aux.loss_ce: 0.0662, aux.acc_seg: 92.8708, loss: 0.1820, grad_norm: 1.7856
2023-02-19 12:14:47,293 - mmseg - INFO - Iter [103050/160000]	lr: 2.136e-05, eta: 4:31:19, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2315, aux.loss_ce: 0.0652, aux.acc_seg: 93.0145, loss: 0.1740, grad_norm: 1.4672
2023-02-19 12:15:01,361 - mmseg - INFO - Iter [103100/160000]	lr: 2.134e-05, eta: 4:31:04, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1149, decode.acc_seg: 95.2458, aux.loss_ce: 0.0665, aux.acc_seg: 93.1707, loss: 0.1814, grad_norm: 1.5607
2023-02-19 12:15:14,977 - mmseg - INFO - Iter [103150/160000]	lr: 2.132e-05, eta: 4:30:50, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1222, decode.acc_seg: 94.6717, aux.loss_ce: 0.0704, aux.acc_seg: 92.4963, loss: 0.1926, grad_norm: 1.4785
2023-02-19 12:15:28,544 - mmseg - INFO - Iter [103200/160000]	lr: 2.130e-05, eta: 4:30:35, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1208, decode.acc_seg: 94.7183, aux.loss_ce: 0.0697, aux.acc_seg: 92.4041, loss: 0.1905, grad_norm: 1.9760
2023-02-19 12:15:42,640 - mmseg - INFO - Iter [103250/160000]	lr: 2.128e-05, eta: 4:30:21, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 95.0250, aux.loss_ce: 0.0690, aux.acc_seg: 92.7316, loss: 0.1844, grad_norm: 1.8493
2023-02-19 12:15:56,490 - mmseg - INFO - Iter [103300/160000]	lr: 2.126e-05, eta: 4:30:06, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1139, decode.acc_seg: 95.0883, aux.loss_ce: 0.0676, aux.acc_seg: 92.9266, loss: 0.1815, grad_norm: 1.6709
2023-02-19 12:16:10,273 - mmseg - INFO - Iter [103350/160000]	lr: 2.124e-05, eta: 4:29:51, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1105, decode.acc_seg: 95.1523, aux.loss_ce: 0.0688, aux.acc_seg: 92.5837, loss: 0.1793, grad_norm: 1.5200
2023-02-19 12:16:24,237 - mmseg - INFO - Iter [103400/160000]	lr: 2.123e-05, eta: 4:29:37, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1238, decode.acc_seg: 94.6147, aux.loss_ce: 0.0700, aux.acc_seg: 92.4361, loss: 0.1938, grad_norm: 1.6252
2023-02-19 12:16:38,439 - mmseg - INFO - Iter [103450/160000]	lr: 2.121e-05, eta: 4:29:23, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.9540, aux.loss_ce: 0.0711, aux.acc_seg: 92.5712, loss: 0.1901, grad_norm: 2.1302
2023-02-19 12:16:51,991 - mmseg - INFO - Iter [103500/160000]	lr: 2.119e-05, eta: 4:29:08, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1127, decode.acc_seg: 95.1105, aux.loss_ce: 0.0658, aux.acc_seg: 93.0046, loss: 0.1785, grad_norm: 1.4509
2023-02-19 12:17:05,611 - mmseg - INFO - Iter [103550/160000]	lr: 2.117e-05, eta: 4:28:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1143, decode.acc_seg: 94.9976, aux.loss_ce: 0.0664, aux.acc_seg: 92.8887, loss: 0.1807, grad_norm: 1.5070
2023-02-19 12:17:21,582 - mmseg - INFO - Iter [103600/160000]	lr: 2.115e-05, eta: 4:28:40, time: 0.319, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1145, decode.acc_seg: 94.8923, aux.loss_ce: 0.0669, aux.acc_seg: 92.7302, loss: 0.1814, grad_norm: 1.5416
2023-02-19 12:17:35,564 - mmseg - INFO - Iter [103650/160000]	lr: 2.113e-05, eta: 4:28:25, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1183, decode.acc_seg: 94.8877, aux.loss_ce: 0.0698, aux.acc_seg: 92.5424, loss: 0.1881, grad_norm: 1.8201
2023-02-19 12:17:49,321 - mmseg - INFO - Iter [103700/160000]	lr: 2.111e-05, eta: 4:28:11, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1137, decode.acc_seg: 95.0589, aux.loss_ce: 0.0668, aux.acc_seg: 92.8618, loss: 0.1805, grad_norm: 1.6252
2023-02-19 12:18:03,476 - mmseg - INFO - Iter [103750/160000]	lr: 2.109e-05, eta: 4:27:56, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1215, decode.acc_seg: 94.5752, aux.loss_ce: 0.0689, aux.acc_seg: 92.5552, loss: 0.1904, grad_norm: 1.5191
2023-02-19 12:18:18,917 - mmseg - INFO - Iter [103800/160000]	lr: 2.108e-05, eta: 4:27:43, time: 0.309, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.1128, aux.loss_ce: 0.0668, aux.acc_seg: 92.7385, loss: 0.1764, grad_norm: 1.7203
2023-02-19 12:18:32,552 - mmseg - INFO - Iter [103850/160000]	lr: 2.106e-05, eta: 4:27:28, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1159, decode.acc_seg: 95.0201, aux.loss_ce: 0.0678, aux.acc_seg: 92.7912, loss: 0.1837, grad_norm: 1.4917
2023-02-19 12:18:46,455 - mmseg - INFO - Iter [103900/160000]	lr: 2.104e-05, eta: 4:27:14, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1153, decode.acc_seg: 94.9304, aux.loss_ce: 0.0668, aux.acc_seg: 92.7971, loss: 0.1821, grad_norm: 2.0784
2023-02-19 12:19:00,301 - mmseg - INFO - Iter [103950/160000]	lr: 2.102e-05, eta: 4:26:59, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1106, decode.acc_seg: 95.1439, aux.loss_ce: 0.0644, aux.acc_seg: 93.0624, loss: 0.1750, grad_norm: 1.2161
2023-02-19 12:19:14,759 - mmseg - INFO - Saving checkpoint at 104000 iterations
2023-02-19 12:19:18,133 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:19:18,133 - mmseg - INFO - Iter [104000/160000]	lr: 2.100e-05, eta: 4:26:47, time: 0.357, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1188, decode.acc_seg: 95.0327, aux.loss_ce: 0.0695, aux.acc_seg: 92.8160, loss: 0.1883, grad_norm: 1.9791
2023-02-19 12:19:31,808 - mmseg - INFO - Iter [104050/160000]	lr: 2.098e-05, eta: 4:26:32, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1151, decode.acc_seg: 94.9266, aux.loss_ce: 0.0685, aux.acc_seg: 92.7187, loss: 0.1836, grad_norm: 2.0564
2023-02-19 12:19:45,450 - mmseg - INFO - Iter [104100/160000]	lr: 2.096e-05, eta: 4:26:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.1824, aux.loss_ce: 0.0653, aux.acc_seg: 92.9923, loss: 0.1749, grad_norm: 1.4265
2023-02-19 12:19:59,270 - mmseg - INFO - Iter [104150/160000]	lr: 2.094e-05, eta: 4:26:03, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.1493, aux.loss_ce: 0.0644, aux.acc_seg: 93.1512, loss: 0.1752, grad_norm: 1.4765
2023-02-19 12:20:13,891 - mmseg - INFO - Iter [104200/160000]	lr: 2.093e-05, eta: 4:25:49, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1165, decode.acc_seg: 95.0171, aux.loss_ce: 0.0682, aux.acc_seg: 92.8199, loss: 0.1847, grad_norm: 1.4099
2023-02-19 12:20:27,969 - mmseg - INFO - Iter [104250/160000]	lr: 2.091e-05, eta: 4:25:34, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1077, decode.acc_seg: 95.2339, aux.loss_ce: 0.0627, aux.acc_seg: 93.1967, loss: 0.1704, grad_norm: 1.4544
2023-02-19 12:20:41,656 - mmseg - INFO - Iter [104300/160000]	lr: 2.089e-05, eta: 4:25:20, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1093, decode.acc_seg: 95.0722, aux.loss_ce: 0.0659, aux.acc_seg: 92.8232, loss: 0.1753, grad_norm: 1.8772
2023-02-19 12:20:55,526 - mmseg - INFO - Iter [104350/160000]	lr: 2.087e-05, eta: 4:25:05, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1016, decode.acc_seg: 95.5833, aux.loss_ce: 0.0616, aux.acc_seg: 93.4395, loss: 0.1632, grad_norm: 1.7668
2023-02-19 12:21:09,755 - mmseg - INFO - Iter [104400/160000]	lr: 2.085e-05, eta: 4:24:51, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1139, decode.acc_seg: 95.0048, aux.loss_ce: 0.0665, aux.acc_seg: 92.8125, loss: 0.1804, grad_norm: 1.6611
2023-02-19 12:21:23,366 - mmseg - INFO - Iter [104450/160000]	lr: 2.083e-05, eta: 4:24:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1099, decode.acc_seg: 95.2279, aux.loss_ce: 0.0662, aux.acc_seg: 93.0170, loss: 0.1760, grad_norm: 1.4846
2023-02-19 12:21:37,061 - mmseg - INFO - Iter [104500/160000]	lr: 2.081e-05, eta: 4:24:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1092, decode.acc_seg: 95.2849, aux.loss_ce: 0.0655, aux.acc_seg: 93.0077, loss: 0.1747, grad_norm: 1.5022
2023-02-19 12:21:51,451 - mmseg - INFO - Iter [104550/160000]	lr: 2.079e-05, eta: 4:24:07, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1182, decode.acc_seg: 94.8303, aux.loss_ce: 0.0676, aux.acc_seg: 92.7913, loss: 0.1858, grad_norm: 1.7989
2023-02-19 12:22:05,029 - mmseg - INFO - Iter [104600/160000]	lr: 2.078e-05, eta: 4:23:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1261, decode.acc_seg: 94.7478, aux.loss_ce: 0.0721, aux.acc_seg: 92.3221, loss: 0.1982, grad_norm: 2.4043
2023-02-19 12:22:20,723 - mmseg - INFO - Iter [104650/160000]	lr: 2.076e-05, eta: 4:23:39, time: 0.314, data_time: 0.007, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 94.8214, aux.loss_ce: 0.0687, aux.acc_seg: 92.6023, loss: 0.1864, grad_norm: 1.7311
2023-02-19 12:22:34,472 - mmseg - INFO - Iter [104700/160000]	lr: 2.074e-05, eta: 4:23:25, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1112, decode.acc_seg: 95.0206, aux.loss_ce: 0.0657, aux.acc_seg: 92.6647, loss: 0.1770, grad_norm: 1.5748
2023-02-19 12:22:48,649 - mmseg - INFO - Iter [104750/160000]	lr: 2.072e-05, eta: 4:23:10, time: 0.284, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1180, decode.acc_seg: 94.8355, aux.loss_ce: 0.0687, aux.acc_seg: 92.4769, loss: 0.1867, grad_norm: 1.5910
2023-02-19 12:23:02,312 - mmseg - INFO - Iter [104800/160000]	lr: 2.070e-05, eta: 4:22:56, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1134, decode.acc_seg: 95.0398, aux.loss_ce: 0.0649, aux.acc_seg: 93.0850, loss: 0.1783, grad_norm: 1.4955
2023-02-19 12:23:18,681 - mmseg - INFO - Iter [104850/160000]	lr: 2.068e-05, eta: 4:22:42, time: 0.327, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1116, decode.acc_seg: 95.1529, aux.loss_ce: 0.0657, aux.acc_seg: 93.0729, loss: 0.1773, grad_norm: 1.3964
2023-02-19 12:23:32,428 - mmseg - INFO - Iter [104900/160000]	lr: 2.066e-05, eta: 4:22:28, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1115, decode.acc_seg: 95.1590, aux.loss_ce: 0.0655, aux.acc_seg: 93.0534, loss: 0.1770, grad_norm: 1.6833
2023-02-19 12:23:46,282 - mmseg - INFO - Iter [104950/160000]	lr: 2.064e-05, eta: 4:22:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1055, decode.acc_seg: 95.3131, aux.loss_ce: 0.0646, aux.acc_seg: 92.9789, loss: 0.1701, grad_norm: 1.5909
2023-02-19 12:24:00,500 - mmseg - INFO - Saving checkpoint at 105000 iterations
2023-02-19 12:24:03,726 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:24:03,726 - mmseg - INFO - Iter [105000/160000]	lr: 2.063e-05, eta: 4:22:01, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1236, decode.acc_seg: 94.7443, aux.loss_ce: 0.0736, aux.acc_seg: 92.3440, loss: 0.1972, grad_norm: 1.6510
2023-02-19 12:24:17,576 - mmseg - INFO - Iter [105050/160000]	lr: 2.061e-05, eta: 4:21:46, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1206, decode.acc_seg: 94.8168, aux.loss_ce: 0.0701, aux.acc_seg: 92.6269, loss: 0.1907, grad_norm: 1.5581
2023-02-19 12:24:31,641 - mmseg - INFO - Iter [105100/160000]	lr: 2.059e-05, eta: 4:21:32, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1162, decode.acc_seg: 94.9904, aux.loss_ce: 0.0702, aux.acc_seg: 92.5376, loss: 0.1864, grad_norm: 1.9678
2023-02-19 12:24:46,423 - mmseg - INFO - Iter [105150/160000]	lr: 2.057e-05, eta: 4:21:18, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1101, decode.acc_seg: 95.2369, aux.loss_ce: 0.0647, aux.acc_seg: 93.0943, loss: 0.1748, grad_norm: 1.3599
2023-02-19 12:25:00,282 - mmseg - INFO - Iter [105200/160000]	lr: 2.055e-05, eta: 4:21:03, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 94.8712, aux.loss_ce: 0.0687, aux.acc_seg: 92.7256, loss: 0.1865, grad_norm: 2.2458
2023-02-19 12:25:13,938 - mmseg - INFO - Iter [105250/160000]	lr: 2.053e-05, eta: 4:20:49, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1100, decode.acc_seg: 95.2257, aux.loss_ce: 0.0659, aux.acc_seg: 92.9142, loss: 0.1759, grad_norm: 1.4994
2023-02-19 12:25:27,588 - mmseg - INFO - Iter [105300/160000]	lr: 2.051e-05, eta: 4:20:34, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1142, decode.acc_seg: 94.9832, aux.loss_ce: 0.0666, aux.acc_seg: 92.8061, loss: 0.1809, grad_norm: 1.4647
2023-02-19 12:25:41,461 - mmseg - INFO - Iter [105350/160000]	lr: 2.049e-05, eta: 4:20:20, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 94.8390, aux.loss_ce: 0.0700, aux.acc_seg: 92.5119, loss: 0.1877, grad_norm: 1.5933
2023-02-19 12:25:55,421 - mmseg - INFO - Iter [105400/160000]	lr: 2.048e-05, eta: 4:20:05, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1080, decode.acc_seg: 95.2840, aux.loss_ce: 0.0628, aux.acc_seg: 93.2765, loss: 0.1708, grad_norm: 1.4525
2023-02-19 12:26:09,292 - mmseg - INFO - Iter [105450/160000]	lr: 2.046e-05, eta: 4:19:51, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2621, aux.loss_ce: 0.0650, aux.acc_seg: 92.9813, loss: 0.1738, grad_norm: 1.5669
2023-02-19 12:26:23,474 - mmseg - INFO - Iter [105500/160000]	lr: 2.044e-05, eta: 4:19:36, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1139, decode.acc_seg: 95.1465, aux.loss_ce: 0.0695, aux.acc_seg: 92.7710, loss: 0.1834, grad_norm: 1.5968
2023-02-19 12:26:37,453 - mmseg - INFO - Iter [105550/160000]	lr: 2.042e-05, eta: 4:19:22, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.3246, aux.loss_ce: 0.0636, aux.acc_seg: 93.2044, loss: 0.1702, grad_norm: 1.4697
2023-02-19 12:26:51,114 - mmseg - INFO - Iter [105600/160000]	lr: 2.040e-05, eta: 4:19:07, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1146, decode.acc_seg: 95.0907, aux.loss_ce: 0.0667, aux.acc_seg: 92.9971, loss: 0.1813, grad_norm: 1.8156
2023-02-19 12:27:04,806 - mmseg - INFO - Iter [105650/160000]	lr: 2.038e-05, eta: 4:18:53, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1053, decode.acc_seg: 95.4654, aux.loss_ce: 0.0638, aux.acc_seg: 93.3363, loss: 0.1691, grad_norm: 1.5878
2023-02-19 12:27:19,363 - mmseg - INFO - Iter [105700/160000]	lr: 2.036e-05, eta: 4:18:38, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1175, decode.acc_seg: 94.9073, aux.loss_ce: 0.0681, aux.acc_seg: 92.8806, loss: 0.1856, grad_norm: 2.1414
2023-02-19 12:27:33,108 - mmseg - INFO - Iter [105750/160000]	lr: 2.034e-05, eta: 4:18:24, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1177, decode.acc_seg: 94.9191, aux.loss_ce: 0.0689, aux.acc_seg: 92.7842, loss: 0.1865, grad_norm: 2.2679
2023-02-19 12:27:46,782 - mmseg - INFO - Iter [105800/160000]	lr: 2.033e-05, eta: 4:18:09, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1079, decode.acc_seg: 95.2801, aux.loss_ce: 0.0632, aux.acc_seg: 93.2282, loss: 0.1711, grad_norm: 1.4848
2023-02-19 12:28:00,480 - mmseg - INFO - Iter [105850/160000]	lr: 2.031e-05, eta: 4:17:55, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1098, decode.acc_seg: 95.2984, aux.loss_ce: 0.0644, aux.acc_seg: 93.2572, loss: 0.1743, grad_norm: 1.4481
2023-02-19 12:28:14,110 - mmseg - INFO - Iter [105900/160000]	lr: 2.029e-05, eta: 4:17:40, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1143, decode.acc_seg: 95.0240, aux.loss_ce: 0.0670, aux.acc_seg: 92.8522, loss: 0.1813, grad_norm: 1.8587
2023-02-19 12:28:27,753 - mmseg - INFO - Iter [105950/160000]	lr: 2.027e-05, eta: 4:17:25, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1121, decode.acc_seg: 95.1003, aux.loss_ce: 0.0671, aux.acc_seg: 92.8400, loss: 0.1793, grad_norm: 1.5256
2023-02-19 12:28:41,563 - mmseg - INFO - Saving checkpoint at 106000 iterations
2023-02-19 12:28:44,786 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:28:44,786 - mmseg - INFO - Iter [106000/160000]	lr: 2.025e-05, eta: 4:17:12, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1173, decode.acc_seg: 95.0018, aux.loss_ce: 0.0680, aux.acc_seg: 92.8054, loss: 0.1854, grad_norm: 1.8012
2023-02-19 12:28:59,479 - mmseg - INFO - Iter [106050/160000]	lr: 2.023e-05, eta: 4:16:58, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.0829, aux.loss_ce: 0.0659, aux.acc_seg: 92.9849, loss: 0.1794, grad_norm: 1.7530
2023-02-19 12:29:16,422 - mmseg - INFO - Iter [106100/160000]	lr: 2.021e-05, eta: 4:16:45, time: 0.339, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1148, decode.acc_seg: 95.0396, aux.loss_ce: 0.0666, aux.acc_seg: 93.0031, loss: 0.1814, grad_norm: 1.6297
2023-02-19 12:29:30,194 - mmseg - INFO - Iter [106150/160000]	lr: 2.019e-05, eta: 4:16:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1164, decode.acc_seg: 95.0020, aux.loss_ce: 0.0674, aux.acc_seg: 92.8875, loss: 0.1838, grad_norm: 2.7143
2023-02-19 12:29:43,923 - mmseg - INFO - Iter [106200/160000]	lr: 2.018e-05, eta: 4:16:16, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.1124, aux.loss_ce: 0.0677, aux.acc_seg: 92.8761, loss: 0.1812, grad_norm: 1.8982
2023-02-19 12:29:57,690 - mmseg - INFO - Iter [106250/160000]	lr: 2.016e-05, eta: 4:16:02, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1173, decode.acc_seg: 94.9779, aux.loss_ce: 0.0719, aux.acc_seg: 92.5850, loss: 0.1892, grad_norm: 1.7852
2023-02-19 12:30:11,791 - mmseg - INFO - Iter [106300/160000]	lr: 2.014e-05, eta: 4:15:47, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 95.0354, aux.loss_ce: 0.0676, aux.acc_seg: 92.8142, loss: 0.1800, grad_norm: 1.5393
2023-02-19 12:30:25,933 - mmseg - INFO - Iter [106350/160000]	lr: 2.012e-05, eta: 4:15:33, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1121, decode.acc_seg: 95.0712, aux.loss_ce: 0.0659, aux.acc_seg: 92.8903, loss: 0.1780, grad_norm: 1.4457
2023-02-19 12:30:40,092 - mmseg - INFO - Iter [106400/160000]	lr: 2.010e-05, eta: 4:15:19, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1188, decode.acc_seg: 95.0000, aux.loss_ce: 0.0683, aux.acc_seg: 92.9521, loss: 0.1871, grad_norm: 1.9909
2023-02-19 12:30:54,398 - mmseg - INFO - Iter [106450/160000]	lr: 2.008e-05, eta: 4:15:04, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1107, decode.acc_seg: 95.1633, aux.loss_ce: 0.0662, aux.acc_seg: 92.9799, loss: 0.1769, grad_norm: 1.3344
2023-02-19 12:31:09,305 - mmseg - INFO - Iter [106500/160000]	lr: 2.006e-05, eta: 4:14:50, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1107, decode.acc_seg: 95.0890, aux.loss_ce: 0.0686, aux.acc_seg: 92.6764, loss: 0.1793, grad_norm: 1.7342
2023-02-19 12:31:22,917 - mmseg - INFO - Iter [106550/160000]	lr: 2.004e-05, eta: 4:14:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1157, decode.acc_seg: 95.0633, aux.loss_ce: 0.0712, aux.acc_seg: 92.6227, loss: 0.1869, grad_norm: 1.8430
2023-02-19 12:31:37,205 - mmseg - INFO - Iter [106600/160000]	lr: 2.003e-05, eta: 4:14:21, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1104, decode.acc_seg: 95.2202, aux.loss_ce: 0.0672, aux.acc_seg: 92.8369, loss: 0.1776, grad_norm: 1.6841
2023-02-19 12:31:51,138 - mmseg - INFO - Iter [106650/160000]	lr: 2.001e-05, eta: 4:14:07, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1100, decode.acc_seg: 95.2357, aux.loss_ce: 0.0646, aux.acc_seg: 93.0579, loss: 0.1746, grad_norm: 1.3828
2023-02-19 12:32:04,808 - mmseg - INFO - Iter [106700/160000]	lr: 1.999e-05, eta: 4:13:52, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.2612, aux.loss_ce: 0.0630, aux.acc_seg: 93.2255, loss: 0.1720, grad_norm: 1.5799
2023-02-19 12:32:19,176 - mmseg - INFO - Iter [106750/160000]	lr: 1.997e-05, eta: 4:13:38, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.3354, aux.loss_ce: 0.0635, aux.acc_seg: 93.0979, loss: 0.1693, grad_norm: 1.3095
2023-02-19 12:32:32,741 - mmseg - INFO - Iter [106800/160000]	lr: 1.995e-05, eta: 4:13:24, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1120, decode.acc_seg: 95.1916, aux.loss_ce: 0.0670, aux.acc_seg: 93.0309, loss: 0.1790, grad_norm: 1.5156
2023-02-19 12:32:46,793 - mmseg - INFO - Iter [106850/160000]	lr: 1.993e-05, eta: 4:13:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1169, decode.acc_seg: 94.7656, aux.loss_ce: 0.0667, aux.acc_seg: 92.6372, loss: 0.1836, grad_norm: 1.5528
2023-02-19 12:33:01,453 - mmseg - INFO - Iter [106900/160000]	lr: 1.991e-05, eta: 4:12:55, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.3342, aux.loss_ce: 0.0649, aux.acc_seg: 93.1140, loss: 0.1717, grad_norm: 1.3547
2023-02-19 12:33:15,283 - mmseg - INFO - Iter [106950/160000]	lr: 1.989e-05, eta: 4:12:40, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.3047, aux.loss_ce: 0.0656, aux.acc_seg: 93.1089, loss: 0.1751, grad_norm: 1.6104
2023-02-19 12:33:28,877 - mmseg - INFO - Saving checkpoint at 107000 iterations
2023-02-19 12:33:32,127 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:33:32,127 - mmseg - INFO - Iter [107000/160000]	lr: 1.988e-05, eta: 4:12:27, time: 0.337, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1122, decode.acc_seg: 95.3017, aux.loss_ce: 0.0677, aux.acc_seg: 92.9477, loss: 0.1799, grad_norm: 1.6224
2023-02-19 12:33:46,051 - mmseg - INFO - Iter [107050/160000]	lr: 1.986e-05, eta: 4:12:13, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1140, decode.acc_seg: 95.0517, aux.loss_ce: 0.0683, aux.acc_seg: 92.8337, loss: 0.1824, grad_norm: 2.1264
2023-02-19 12:33:59,869 - mmseg - INFO - Iter [107100/160000]	lr: 1.984e-05, eta: 4:11:58, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1144, decode.acc_seg: 95.0379, aux.loss_ce: 0.0682, aux.acc_seg: 92.8041, loss: 0.1826, grad_norm: 1.5926
2023-02-19 12:34:14,080 - mmseg - INFO - Iter [107150/160000]	lr: 1.982e-05, eta: 4:11:44, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1103, decode.acc_seg: 95.1754, aux.loss_ce: 0.0670, aux.acc_seg: 92.8112, loss: 0.1773, grad_norm: 1.5058
2023-02-19 12:34:27,669 - mmseg - INFO - Iter [107200/160000]	lr: 1.980e-05, eta: 4:11:29, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1146, decode.acc_seg: 95.0186, aux.loss_ce: 0.0665, aux.acc_seg: 92.9416, loss: 0.1811, grad_norm: 1.5011
2023-02-19 12:34:41,392 - mmseg - INFO - Iter [107250/160000]	lr: 1.978e-05, eta: 4:11:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1137, decode.acc_seg: 94.9856, aux.loss_ce: 0.0663, aux.acc_seg: 92.8290, loss: 0.1800, grad_norm: 1.4194
2023-02-19 12:34:54,947 - mmseg - INFO - Iter [107300/160000]	lr: 1.976e-05, eta: 4:11:00, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.0135, aux.loss_ce: 0.0640, aux.acc_seg: 92.9978, loss: 0.1748, grad_norm: 1.5327
2023-02-19 12:35:08,486 - mmseg - INFO - Iter [107350/160000]	lr: 1.974e-05, eta: 4:10:46, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.2212, aux.loss_ce: 0.0660, aux.acc_seg: 92.8686, loss: 0.1762, grad_norm: 1.8412
2023-02-19 12:35:24,631 - mmseg - INFO - Iter [107400/160000]	lr: 1.973e-05, eta: 4:10:32, time: 0.323, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1112, decode.acc_seg: 95.0977, aux.loss_ce: 0.0650, aux.acc_seg: 92.9070, loss: 0.1762, grad_norm: 1.8063
2023-02-19 12:35:38,297 - mmseg - INFO - Iter [107450/160000]	lr: 1.971e-05, eta: 4:10:18, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1189, decode.acc_seg: 94.8684, aux.loss_ce: 0.0707, aux.acc_seg: 92.5617, loss: 0.1896, grad_norm: 1.8656
2023-02-19 12:35:53,406 - mmseg - INFO - Iter [107500/160000]	lr: 1.969e-05, eta: 4:10:04, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1150, decode.acc_seg: 94.8666, aux.loss_ce: 0.0662, aux.acc_seg: 92.8663, loss: 0.1812, grad_norm: 1.6521
2023-02-19 12:36:07,740 - mmseg - INFO - Iter [107550/160000]	lr: 1.967e-05, eta: 4:09:49, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1084, decode.acc_seg: 95.3234, aux.loss_ce: 0.0649, aux.acc_seg: 93.1556, loss: 0.1733, grad_norm: 1.5000
2023-02-19 12:36:21,988 - mmseg - INFO - Iter [107600/160000]	lr: 1.965e-05, eta: 4:09:35, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.1853, aux.loss_ce: 0.0608, aux.acc_seg: 93.3483, loss: 0.1682, grad_norm: 1.3110
2023-02-19 12:36:35,864 - mmseg - INFO - Iter [107650/160000]	lr: 1.963e-05, eta: 4:09:21, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 94.8775, aux.loss_ce: 0.0687, aux.acc_seg: 92.5710, loss: 0.1839, grad_norm: 1.8253
2023-02-19 12:36:49,811 - mmseg - INFO - Iter [107700/160000]	lr: 1.961e-05, eta: 4:09:06, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.0751, aux.loss_ce: 0.0663, aux.acc_seg: 92.7990, loss: 0.1766, grad_norm: 1.6890
2023-02-19 12:37:03,431 - mmseg - INFO - Iter [107750/160000]	lr: 1.959e-05, eta: 4:08:52, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1114, decode.acc_seg: 95.1139, aux.loss_ce: 0.0659, aux.acc_seg: 92.8574, loss: 0.1772, grad_norm: 1.4825
2023-02-19 12:37:17,062 - mmseg - INFO - Iter [107800/160000]	lr: 1.958e-05, eta: 4:08:37, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1101, decode.acc_seg: 95.0742, aux.loss_ce: 0.0645, aux.acc_seg: 93.0165, loss: 0.1746, grad_norm: 1.6476
2023-02-19 12:37:31,046 - mmseg - INFO - Iter [107850/160000]	lr: 1.956e-05, eta: 4:08:23, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1093, decode.acc_seg: 95.1827, aux.loss_ce: 0.0646, aux.acc_seg: 93.0472, loss: 0.1739, grad_norm: 1.8075
2023-02-19 12:37:45,185 - mmseg - INFO - Iter [107900/160000]	lr: 1.954e-05, eta: 4:08:08, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1110, decode.acc_seg: 95.1259, aux.loss_ce: 0.0663, aux.acc_seg: 92.9029, loss: 0.1772, grad_norm: 1.4572
2023-02-19 12:37:59,114 - mmseg - INFO - Iter [107950/160000]	lr: 1.952e-05, eta: 4:07:54, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.2179, aux.loss_ce: 0.0645, aux.acc_seg: 93.1441, loss: 0.1748, grad_norm: 1.4443
2023-02-19 12:38:12,779 - mmseg - INFO - Saving checkpoint at 108000 iterations
2023-02-19 12:38:16,051 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:38:16,051 - mmseg - INFO - Iter [108000/160000]	lr: 1.950e-05, eta: 4:07:41, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1157, decode.acc_seg: 94.9834, aux.loss_ce: 0.0718, aux.acc_seg: 92.6454, loss: 0.1875, grad_norm: 1.9864
2023-02-19 12:38:30,233 - mmseg - INFO - Iter [108050/160000]	lr: 1.948e-05, eta: 4:07:26, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1077, decode.acc_seg: 95.1549, aux.loss_ce: 0.0651, aux.acc_seg: 92.7773, loss: 0.1728, grad_norm: 1.3747
2023-02-19 12:38:45,054 - mmseg - INFO - Iter [108100/160000]	lr: 1.946e-05, eta: 4:07:12, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.0797, aux.loss_ce: 0.0650, aux.acc_seg: 92.9742, loss: 0.1758, grad_norm: 1.4712
2023-02-19 12:38:58,699 - mmseg - INFO - Iter [108150/160000]	lr: 1.944e-05, eta: 4:06:58, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1175, decode.acc_seg: 94.9246, aux.loss_ce: 0.0703, aux.acc_seg: 92.5121, loss: 0.1877, grad_norm: 1.5533
2023-02-19 12:39:12,428 - mmseg - INFO - Iter [108200/160000]	lr: 1.943e-05, eta: 4:06:43, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1172, decode.acc_seg: 94.9187, aux.loss_ce: 0.0686, aux.acc_seg: 92.7758, loss: 0.1859, grad_norm: 2.0866
2023-02-19 12:39:26,204 - mmseg - INFO - Iter [108250/160000]	lr: 1.941e-05, eta: 4:06:29, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1121, decode.acc_seg: 94.9162, aux.loss_ce: 0.0639, aux.acc_seg: 93.0206, loss: 0.1760, grad_norm: 1.9306
2023-02-19 12:39:40,481 - mmseg - INFO - Iter [108300/160000]	lr: 1.939e-05, eta: 4:06:14, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.1844, aux.loss_ce: 0.0626, aux.acc_seg: 93.3071, loss: 0.1716, grad_norm: 1.5145
2023-02-19 12:39:54,151 - mmseg - INFO - Iter [108350/160000]	lr: 1.937e-05, eta: 4:06:00, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1119, decode.acc_seg: 95.0982, aux.loss_ce: 0.0656, aux.acc_seg: 92.9911, loss: 0.1775, grad_norm: 1.2608
2023-02-19 12:40:07,894 - mmseg - INFO - Iter [108400/160000]	lr: 1.935e-05, eta: 4:05:45, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1093, decode.acc_seg: 95.3420, aux.loss_ce: 0.0649, aux.acc_seg: 93.1348, loss: 0.1743, grad_norm: 1.4486
2023-02-19 12:40:21,805 - mmseg - INFO - Iter [108450/160000]	lr: 1.933e-05, eta: 4:05:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1130, decode.acc_seg: 95.1293, aux.loss_ce: 0.0668, aux.acc_seg: 92.9409, loss: 0.1798, grad_norm: 1.2941
2023-02-19 12:40:35,515 - mmseg - INFO - Iter [108500/160000]	lr: 1.931e-05, eta: 4:05:16, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.0750, aux.loss_ce: 0.0655, aux.acc_seg: 92.9733, loss: 0.1763, grad_norm: 1.1964
2023-02-19 12:40:49,402 - mmseg - INFO - Iter [108550/160000]	lr: 1.929e-05, eta: 4:05:02, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1134, decode.acc_seg: 94.9755, aux.loss_ce: 0.0667, aux.acc_seg: 92.7419, loss: 0.1801, grad_norm: 1.3566
2023-02-19 12:41:03,594 - mmseg - INFO - Iter [108600/160000]	lr: 1.928e-05, eta: 4:04:47, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1116, decode.acc_seg: 95.1612, aux.loss_ce: 0.0656, aux.acc_seg: 93.0927, loss: 0.1772, grad_norm: 2.0338
2023-02-19 12:41:19,431 - mmseg - INFO - Iter [108650/160000]	lr: 1.926e-05, eta: 4:04:34, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1115, decode.acc_seg: 95.1577, aux.loss_ce: 0.0668, aux.acc_seg: 92.9184, loss: 0.1783, grad_norm: 1.4720
2023-02-19 12:41:33,218 - mmseg - INFO - Iter [108700/160000]	lr: 1.924e-05, eta: 4:04:19, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.2543, aux.loss_ce: 0.0656, aux.acc_seg: 93.0013, loss: 0.1765, grad_norm: 1.7042
2023-02-19 12:41:47,100 - mmseg - INFO - Iter [108750/160000]	lr: 1.922e-05, eta: 4:04:05, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1174, decode.acc_seg: 94.9547, aux.loss_ce: 0.0698, aux.acc_seg: 92.7258, loss: 0.1872, grad_norm: 2.2139
2023-02-19 12:42:01,164 - mmseg - INFO - Iter [108800/160000]	lr: 1.920e-05, eta: 4:03:50, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1104, decode.acc_seg: 95.2415, aux.loss_ce: 0.0650, aux.acc_seg: 93.0673, loss: 0.1753, grad_norm: 1.4221
2023-02-19 12:42:16,056 - mmseg - INFO - Iter [108850/160000]	lr: 1.918e-05, eta: 4:03:36, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.3819, aux.loss_ce: 0.0626, aux.acc_seg: 93.4084, loss: 0.1694, grad_norm: 1.7866
2023-02-19 12:42:30,482 - mmseg - INFO - Iter [108900/160000]	lr: 1.916e-05, eta: 4:03:22, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1140, decode.acc_seg: 95.1088, aux.loss_ce: 0.0663, aux.acc_seg: 93.0003, loss: 0.1803, grad_norm: 1.7527
2023-02-19 12:42:44,826 - mmseg - INFO - Iter [108950/160000]	lr: 1.914e-05, eta: 4:03:08, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.1748, aux.loss_ce: 0.0644, aux.acc_seg: 92.9966, loss: 0.1709, grad_norm: 4.8180
2023-02-19 12:42:58,625 - mmseg - INFO - Saving checkpoint at 109000 iterations
2023-02-19 12:43:01,866 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:43:01,866 - mmseg - INFO - Iter [109000/160000]	lr: 1.913e-05, eta: 4:02:55, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.1801, aux.loss_ce: 0.0631, aux.acc_seg: 93.1868, loss: 0.1726, grad_norm: 1.6961
2023-02-19 12:43:15,540 - mmseg - INFO - Iter [109050/160000]	lr: 1.911e-05, eta: 4:02:40, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1084, decode.acc_seg: 95.3649, aux.loss_ce: 0.0630, aux.acc_seg: 93.2758, loss: 0.1715, grad_norm: 1.4195
2023-02-19 12:43:29,123 - mmseg - INFO - Iter [109100/160000]	lr: 1.909e-05, eta: 4:02:26, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1129, decode.acc_seg: 95.0988, aux.loss_ce: 0.0635, aux.acc_seg: 93.1921, loss: 0.1763, grad_norm: 1.4265
2023-02-19 12:43:42,745 - mmseg - INFO - Iter [109150/160000]	lr: 1.907e-05, eta: 4:02:11, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.3605, aux.loss_ce: 0.0630, aux.acc_seg: 93.3089, loss: 0.1698, grad_norm: 1.3940
2023-02-19 12:43:56,591 - mmseg - INFO - Iter [109200/160000]	lr: 1.905e-05, eta: 4:01:57, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.0388, aux.loss_ce: 0.0669, aux.acc_seg: 92.6992, loss: 0.1779, grad_norm: 1.8765
2023-02-19 12:44:10,267 - mmseg - INFO - Iter [109250/160000]	lr: 1.903e-05, eta: 4:01:42, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1122, decode.acc_seg: 95.0487, aux.loss_ce: 0.0668, aux.acc_seg: 92.7153, loss: 0.1790, grad_norm: 1.7159
2023-02-19 12:44:24,312 - mmseg - INFO - Iter [109300/160000]	lr: 1.901e-05, eta: 4:01:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1050, decode.acc_seg: 95.4675, aux.loss_ce: 0.0629, aux.acc_seg: 93.3157, loss: 0.1679, grad_norm: 1.6825
2023-02-19 12:44:38,517 - mmseg - INFO - Iter [109350/160000]	lr: 1.899e-05, eta: 4:01:13, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1062, decode.acc_seg: 95.4036, aux.loss_ce: 0.0614, aux.acc_seg: 93.5506, loss: 0.1675, grad_norm: 1.7169
2023-02-19 12:44:52,115 - mmseg - INFO - Iter [109400/160000]	lr: 1.898e-05, eta: 4:00:59, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.0748, aux.loss_ce: 0.0648, aux.acc_seg: 92.8779, loss: 0.1743, grad_norm: 1.5772
2023-02-19 12:45:05,724 - mmseg - INFO - Iter [109450/160000]	lr: 1.896e-05, eta: 4:00:44, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1127, decode.acc_seg: 95.1021, aux.loss_ce: 0.0682, aux.acc_seg: 92.7249, loss: 0.1809, grad_norm: 1.6028
2023-02-19 12:45:19,582 - mmseg - INFO - Iter [109500/160000]	lr: 1.894e-05, eta: 4:00:30, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.2463, aux.loss_ce: 0.0648, aux.acc_seg: 93.1468, loss: 0.1750, grad_norm: 1.4012
2023-02-19 12:45:33,366 - mmseg - INFO - Iter [109550/160000]	lr: 1.892e-05, eta: 4:00:15, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1094, decode.acc_seg: 95.1433, aux.loss_ce: 0.0647, aux.acc_seg: 92.9703, loss: 0.1741, grad_norm: 1.6752
2023-02-19 12:45:47,252 - mmseg - INFO - Iter [109600/160000]	lr: 1.890e-05, eta: 4:00:01, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1005, decode.acc_seg: 95.5196, aux.loss_ce: 0.0602, aux.acc_seg: 93.4279, loss: 0.1607, grad_norm: 1.1331
2023-02-19 12:46:01,230 - mmseg - INFO - Iter [109650/160000]	lr: 1.888e-05, eta: 3:59:46, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1130, decode.acc_seg: 95.0782, aux.loss_ce: 0.0667, aux.acc_seg: 92.8717, loss: 0.1796, grad_norm: 1.4000
2023-02-19 12:46:16,431 - mmseg - INFO - Iter [109700/160000]	lr: 1.886e-05, eta: 3:59:32, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1174, decode.acc_seg: 94.8818, aux.loss_ce: 0.0656, aux.acc_seg: 92.9739, loss: 0.1830, grad_norm: 1.6940
2023-02-19 12:46:30,205 - mmseg - INFO - Iter [109750/160000]	lr: 1.884e-05, eta: 3:59:18, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 95.1381, aux.loss_ce: 0.0673, aux.acc_seg: 92.9499, loss: 0.1825, grad_norm: 1.5140
2023-02-19 12:46:44,688 - mmseg - INFO - Iter [109800/160000]	lr: 1.883e-05, eta: 3:59:04, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1086, decode.acc_seg: 95.2492, aux.loss_ce: 0.0626, aux.acc_seg: 93.2513, loss: 0.1712, grad_norm: 1.3501
2023-02-19 12:46:58,455 - mmseg - INFO - Iter [109850/160000]	lr: 1.881e-05, eta: 3:58:49, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1136, decode.acc_seg: 94.9634, aux.loss_ce: 0.0659, aux.acc_seg: 92.8855, loss: 0.1795, grad_norm: 1.4209
2023-02-19 12:47:14,614 - mmseg - INFO - Iter [109900/160000]	lr: 1.879e-05, eta: 3:58:36, time: 0.323, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1133, decode.acc_seg: 94.9523, aux.loss_ce: 0.0640, aux.acc_seg: 92.9837, loss: 0.1773, grad_norm: 1.5062
2023-02-19 12:47:28,769 - mmseg - INFO - Iter [109950/160000]	lr: 1.877e-05, eta: 3:58:21, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1075, decode.acc_seg: 95.2988, aux.loss_ce: 0.0633, aux.acc_seg: 93.2261, loss: 0.1708, grad_norm: 1.4050
2023-02-19 12:47:42,501 - mmseg - INFO - Saving checkpoint at 110000 iterations
2023-02-19 12:47:45,790 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:47:45,790 - mmseg - INFO - Iter [110000/160000]	lr: 1.875e-05, eta: 3:58:08, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.2376, aux.loss_ce: 0.0639, aux.acc_seg: 93.2054, loss: 0.1729, grad_norm: 1.4443
2023-02-19 12:47:59,845 - mmseg - INFO - Iter [110050/160000]	lr: 1.873e-05, eta: 3:57:54, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1071, decode.acc_seg: 95.2997, aux.loss_ce: 0.0622, aux.acc_seg: 93.2444, loss: 0.1693, grad_norm: 2.2158
2023-02-19 12:48:13,872 - mmseg - INFO - Iter [110100/160000]	lr: 1.871e-05, eta: 3:57:39, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1137, decode.acc_seg: 94.9922, aux.loss_ce: 0.0686, aux.acc_seg: 92.6982, loss: 0.1823, grad_norm: 1.6027
2023-02-19 12:48:27,506 - mmseg - INFO - Iter [110150/160000]	lr: 1.869e-05, eta: 3:57:25, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4969, aux.loss_ce: 0.0600, aux.acc_seg: 93.6618, loss: 0.1643, grad_norm: 1.1550
2023-02-19 12:48:41,778 - mmseg - INFO - Iter [110200/160000]	lr: 1.868e-05, eta: 3:57:11, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1089, decode.acc_seg: 95.1681, aux.loss_ce: 0.0642, aux.acc_seg: 93.1126, loss: 0.1730, grad_norm: 1.5597
2023-02-19 12:48:55,689 - mmseg - INFO - Iter [110250/160000]	lr: 1.866e-05, eta: 3:56:56, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1021, decode.acc_seg: 95.4795, aux.loss_ce: 0.0591, aux.acc_seg: 93.6108, loss: 0.1613, grad_norm: 1.2603
2023-02-19 12:49:09,522 - mmseg - INFO - Iter [110300/160000]	lr: 1.864e-05, eta: 3:56:42, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1079, decode.acc_seg: 95.2000, aux.loss_ce: 0.0660, aux.acc_seg: 92.7753, loss: 0.1740, grad_norm: 1.5607
2023-02-19 12:49:23,428 - mmseg - INFO - Iter [110350/160000]	lr: 1.862e-05, eta: 3:56:27, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1114, decode.acc_seg: 95.2219, aux.loss_ce: 0.0666, aux.acc_seg: 93.0577, loss: 0.1780, grad_norm: 1.4051
2023-02-19 12:49:38,167 - mmseg - INFO - Iter [110400/160000]	lr: 1.860e-05, eta: 3:56:13, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1117, decode.acc_seg: 95.0796, aux.loss_ce: 0.0656, aux.acc_seg: 92.9546, loss: 0.1773, grad_norm: 1.2895
2023-02-19 12:49:51,976 - mmseg - INFO - Iter [110450/160000]	lr: 1.858e-05, eta: 3:55:59, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2104, aux.loss_ce: 0.0654, aux.acc_seg: 93.0070, loss: 0.1742, grad_norm: 1.9111
2023-02-19 12:50:05,970 - mmseg - INFO - Iter [110500/160000]	lr: 1.856e-05, eta: 3:55:44, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1099, decode.acc_seg: 95.1891, aux.loss_ce: 0.0657, aux.acc_seg: 93.1228, loss: 0.1756, grad_norm: 1.8876
2023-02-19 12:50:20,080 - mmseg - INFO - Iter [110550/160000]	lr: 1.854e-05, eta: 3:55:30, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1089, decode.acc_seg: 95.2359, aux.loss_ce: 0.0635, aux.acc_seg: 93.1160, loss: 0.1723, grad_norm: 1.3450
2023-02-19 12:50:34,329 - mmseg - INFO - Iter [110600/160000]	lr: 1.853e-05, eta: 3:55:16, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1094, decode.acc_seg: 95.0887, aux.loss_ce: 0.0663, aux.acc_seg: 92.8458, loss: 0.1757, grad_norm: 1.7319
2023-02-19 12:50:48,793 - mmseg - INFO - Iter [110650/160000]	lr: 1.851e-05, eta: 3:55:01, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.4259, aux.loss_ce: 0.0638, aux.acc_seg: 93.2125, loss: 0.1686, grad_norm: 1.8073
2023-02-19 12:51:03,484 - mmseg - INFO - Iter [110700/160000]	lr: 1.849e-05, eta: 3:54:47, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1175, decode.acc_seg: 94.7982, aux.loss_ce: 0.0681, aux.acc_seg: 92.7203, loss: 0.1856, grad_norm: 1.6091
2023-02-19 12:51:17,310 - mmseg - INFO - Iter [110750/160000]	lr: 1.847e-05, eta: 3:54:33, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1138, decode.acc_seg: 95.0550, aux.loss_ce: 0.0655, aux.acc_seg: 93.0500, loss: 0.1793, grad_norm: 1.6078
2023-02-19 12:51:31,490 - mmseg - INFO - Iter [110800/160000]	lr: 1.845e-05, eta: 3:54:18, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1087, decode.acc_seg: 95.2606, aux.loss_ce: 0.0675, aux.acc_seg: 92.7409, loss: 0.1762, grad_norm: 1.7386
2023-02-19 12:51:45,061 - mmseg - INFO - Iter [110850/160000]	lr: 1.843e-05, eta: 3:54:04, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1139, decode.acc_seg: 95.0832, aux.loss_ce: 0.0702, aux.acc_seg: 92.5113, loss: 0.1841, grad_norm: 1.5455
2023-02-19 12:51:59,283 - mmseg - INFO - Iter [110900/160000]	lr: 1.841e-05, eta: 3:53:49, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 95.1192, aux.loss_ce: 0.0662, aux.acc_seg: 92.9395, loss: 0.1786, grad_norm: 1.6649
2023-02-19 12:52:13,119 - mmseg - INFO - Iter [110950/160000]	lr: 1.839e-05, eta: 3:53:35, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1055, decode.acc_seg: 95.3254, aux.loss_ce: 0.0658, aux.acc_seg: 92.9985, loss: 0.1713, grad_norm: 1.6176
2023-02-19 12:52:26,870 - mmseg - INFO - Saving checkpoint at 111000 iterations
2023-02-19 12:52:30,206 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:52:30,206 - mmseg - INFO - Iter [111000/160000]	lr: 1.838e-05, eta: 3:53:22, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1054, decode.acc_seg: 95.3439, aux.loss_ce: 0.0624, aux.acc_seg: 93.2022, loss: 0.1678, grad_norm: 1.3180
2023-02-19 12:52:43,875 - mmseg - INFO - Iter [111050/160000]	lr: 1.836e-05, eta: 3:53:07, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1152, decode.acc_seg: 95.0261, aux.loss_ce: 0.0654, aux.acc_seg: 93.0298, loss: 0.1805, grad_norm: 1.4735
2023-02-19 12:52:57,580 - mmseg - INFO - Iter [111100/160000]	lr: 1.834e-05, eta: 3:52:53, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.3411, aux.loss_ce: 0.0629, aux.acc_seg: 93.3413, loss: 0.1698, grad_norm: 1.3968
2023-02-19 12:53:13,367 - mmseg - INFO - Iter [111150/160000]	lr: 1.832e-05, eta: 3:52:39, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1143, decode.acc_seg: 95.0381, aux.loss_ce: 0.0664, aux.acc_seg: 93.0703, loss: 0.1807, grad_norm: 1.4087
2023-02-19 12:53:27,190 - mmseg - INFO - Iter [111200/160000]	lr: 1.830e-05, eta: 3:52:25, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1116, decode.acc_seg: 95.1838, aux.loss_ce: 0.0668, aux.acc_seg: 92.8973, loss: 0.1784, grad_norm: 1.5260
2023-02-19 12:53:40,988 - mmseg - INFO - Iter [111250/160000]	lr: 1.828e-05, eta: 3:52:10, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1073, decode.acc_seg: 95.1978, aux.loss_ce: 0.0631, aux.acc_seg: 93.1477, loss: 0.1704, grad_norm: 1.5562
2023-02-19 12:53:54,809 - mmseg - INFO - Iter [111300/160000]	lr: 1.826e-05, eta: 3:51:56, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1054, decode.acc_seg: 95.3682, aux.loss_ce: 0.0625, aux.acc_seg: 93.3082, loss: 0.1680, grad_norm: 1.5475
2023-02-19 12:54:08,885 - mmseg - INFO - Iter [111350/160000]	lr: 1.824e-05, eta: 3:51:41, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1080, decode.acc_seg: 95.0522, aux.loss_ce: 0.0638, aux.acc_seg: 92.9400, loss: 0.1718, grad_norm: 1.7631
2023-02-19 12:54:23,314 - mmseg - INFO - Iter [111400/160000]	lr: 1.823e-05, eta: 3:51:27, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1110, decode.acc_seg: 95.1598, aux.loss_ce: 0.0657, aux.acc_seg: 92.9409, loss: 0.1767, grad_norm: 2.1150
2023-02-19 12:54:37,934 - mmseg - INFO - Iter [111450/160000]	lr: 1.821e-05, eta: 3:51:13, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1052, decode.acc_seg: 95.3626, aux.loss_ce: 0.0632, aux.acc_seg: 93.2618, loss: 0.1683, grad_norm: 1.2754
2023-02-19 12:54:52,319 - mmseg - INFO - Iter [111500/160000]	lr: 1.819e-05, eta: 3:50:59, time: 0.288, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1055, decode.acc_seg: 95.3619, aux.loss_ce: 0.0639, aux.acc_seg: 93.1356, loss: 0.1694, grad_norm: 1.3717
2023-02-19 12:55:06,372 - mmseg - INFO - Iter [111550/160000]	lr: 1.817e-05, eta: 3:50:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.1026, aux.loss_ce: 0.0661, aux.acc_seg: 93.0228, loss: 0.1796, grad_norm: 1.7383
2023-02-19 12:55:20,461 - mmseg - INFO - Iter [111600/160000]	lr: 1.815e-05, eta: 3:50:30, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.1955, aux.loss_ce: 0.0648, aux.acc_seg: 92.9468, loss: 0.1736, grad_norm: 1.4136
2023-02-19 12:55:34,566 - mmseg - INFO - Iter [111650/160000]	lr: 1.813e-05, eta: 3:50:16, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1096, decode.acc_seg: 95.3312, aux.loss_ce: 0.0655, aux.acc_seg: 93.2195, loss: 0.1751, grad_norm: 1.5995
2023-02-19 12:55:48,305 - mmseg - INFO - Iter [111700/160000]	lr: 1.811e-05, eta: 3:50:01, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.3421, aux.loss_ce: 0.0633, aux.acc_seg: 93.1932, loss: 0.1691, grad_norm: 1.3803
2023-02-19 12:56:02,374 - mmseg - INFO - Iter [111750/160000]	lr: 1.809e-05, eta: 3:49:47, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1168, decode.acc_seg: 94.9336, aux.loss_ce: 0.0694, aux.acc_seg: 92.6879, loss: 0.1862, grad_norm: 2.1798
2023-02-19 12:56:16,833 - mmseg - INFO - Iter [111800/160000]	lr: 1.808e-05, eta: 3:49:32, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1033, decode.acc_seg: 95.4520, aux.loss_ce: 0.0606, aux.acc_seg: 93.4799, loss: 0.1639, grad_norm: 1.6740
2023-02-19 12:56:30,540 - mmseg - INFO - Iter [111850/160000]	lr: 1.806e-05, eta: 3:49:18, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1134, decode.acc_seg: 94.9220, aux.loss_ce: 0.0677, aux.acc_seg: 92.6500, loss: 0.1811, grad_norm: 1.5769
2023-02-19 12:56:44,484 - mmseg - INFO - Iter [111900/160000]	lr: 1.804e-05, eta: 3:49:03, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1146, decode.acc_seg: 94.9707, aux.loss_ce: 0.0676, aux.acc_seg: 92.7062, loss: 0.1822, grad_norm: 1.5964
2023-02-19 12:56:58,103 - mmseg - INFO - Iter [111950/160000]	lr: 1.802e-05, eta: 3:48:49, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1086, decode.acc_seg: 95.3086, aux.loss_ce: 0.0649, aux.acc_seg: 93.1324, loss: 0.1735, grad_norm: 1.5444
2023-02-19 12:57:13,077 - mmseg - INFO - Saving checkpoint at 112000 iterations
2023-02-19 12:57:16,353 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:57:16,353 - mmseg - INFO - Iter [112000/160000]	lr: 1.800e-05, eta: 3:48:36, time: 0.365, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.1273, aux.loss_ce: 0.0630, aux.acc_seg: 93.2120, loss: 0.1733, grad_norm: 1.3785
2023-02-19 12:57:30,683 - mmseg - INFO - per class results:
2023-02-19 12:57:30,689 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 79.08 | 87.35 |
|       building      | 82.97 | 92.24 |
|         sky         | 94.39 | 97.89 |
|        floor        | 81.11 | 91.49 |
|         tree        | 75.38 | 90.29 |
|       ceiling       | 84.27 | 95.38 |
|         road        | 84.04 | 91.06 |
|         bed         | 90.49 | 96.38 |
|      windowpane     | 63.19 | 77.61 |
|        grass        | 67.36 | 81.33 |
|       cabinet       | 60.27 | 72.75 |
|       sidewalk      | 69.64 |  86.2 |
|        person       | 82.63 | 92.86 |
|        earth        | 36.34 | 51.54 |
|         door        | 53.66 | 66.65 |
|        table        | 63.66 | 74.68 |
|       mountain      | 60.23 | 69.82 |
|        plant        | 51.72 | 61.85 |
|       curtain       | 76.46 |  87.4 |
|        chair        | 63.69 | 78.75 |
|         car         | 85.12 | 90.95 |
|        water        | 60.48 | 75.53 |
|       painting      | 77.11 | 90.28 |
|         sofa        | 73.88 | 89.73 |
|        shelf        | 45.46 | 64.69 |
|        house        | 40.22 | 46.06 |
|         sea         |  64.2 |  84.5 |
|        mirror       | 71.91 | 81.87 |
|         rug         | 50.33 | 58.23 |
|        field        | 31.98 | 48.56 |
|       armchair      | 51.43 | 65.22 |
|         seat        | 59.99 | 88.69 |
|        fence        | 45.47 | 58.34 |
|         desk        | 55.13 | 72.83 |
|         rock        | 54.54 |  79.2 |
|       wardrobe      | 42.69 | 64.69 |
|         lamp        | 66.96 | 77.57 |
|       bathtub       | 81.66 | 85.43 |
|       railing       | 37.73 | 57.58 |
|       cushion       | 61.36 | 80.82 |
|         base        | 40.77 | 50.41 |
|         box         | 31.31 | 41.54 |
|        column       | 48.19 | 69.75 |
|      signboard      | 40.09 |  56.8 |
|   chest of drawers  | 38.33 | 57.69 |
|       counter       | 24.71 | 31.01 |
|         sand        |  51.8 | 73.65 |
|         sink        | 74.04 | 81.62 |
|      skyscraper     | 50.16 | 61.56 |
|      fireplace      | 72.91 | 93.27 |
|     refrigerator    | 82.73 | 93.25 |
|      grandstand     | 41.48 | 71.94 |
|         path        | 24.53 |  34.4 |
|        stairs       | 28.74 | 36.43 |
|        runway       | 68.39 | 89.99 |
|         case        | 49.66 | 74.52 |
|      pool table     |  93.2 | 97.35 |
|        pillow       | 52.97 | 59.02 |
|     screen door     | 77.32 | 79.85 |
|       stairway      |  30.8 | 44.65 |
|        river        | 10.48 |  19.0 |
|        bridge       | 67.34 | 77.74 |
|       bookcase      | 45.46 | 66.57 |
|        blind        | 44.81 | 49.47 |
|     coffee table    | 60.85 | 82.06 |
|        toilet       | 86.68 | 91.62 |
|        flower       | 42.55 |  60.3 |
|         book        | 43.71 | 66.84 |
|         hill        | 12.76 | 21.12 |
|        bench        | 47.71 |  52.5 |
|      countertop     | 55.49 | 77.97 |
|        stove        |  82.5 | 85.79 |
|         palm        | 57.91 | 75.73 |
|    kitchen island   | 47.84 | 75.07 |
|       computer      | 77.53 |  87.0 |
|     swivel chair    | 43.13 | 59.85 |
|         boat        | 51.59 | 58.75 |
|         bar         | 48.81 | 63.92 |
|    arcade machine   | 47.07 | 49.34 |
|        hovel        |  44.8 | 50.65 |
|         bus         | 87.19 | 96.63 |
|        towel        | 72.27 | 85.53 |
|        light        | 57.77 | 66.34 |
|        truck        | 39.98 | 50.18 |
|        tower        | 35.63 | 46.35 |
|      chandelier     | 69.27 | 87.19 |
|        awning       | 36.24 | 41.45 |
|     streetlight     | 35.37 | 43.55 |
|        booth        | 44.36 |  48.8 |
| television receiver | 73.76 | 82.05 |
|       airplane      | 61.95 | 66.33 |
|      dirt track     |  6.87 | 28.79 |
|       apparel       | 44.94 | 61.07 |
|         pole        | 29.12 | 54.17 |
|         land        |  5.38 |  7.39 |
|      bannister      | 15.52 | 25.49 |
|      escalator      | 44.88 | 62.15 |
|       ottoman       | 47.17 | 72.81 |
|        bottle       | 35.65 | 60.86 |
|        buffet       | 35.09 | 37.89 |
|        poster       | 30.52 | 43.99 |
|        stage        | 20.52 | 28.88 |
|         van         | 35.49 | 48.34 |
|         ship        | 39.64 | 56.27 |
|       fountain      | 26.22 | 26.52 |
|    conveyer belt    | 72.52 | 95.35 |
|        canopy       | 44.72 | 60.93 |
|        washer       | 72.45 | 75.98 |
|      plaything      | 36.44 | 51.21 |
|    swimming pool    | 52.67 | 68.21 |
|        stool        | 50.35 | 70.81 |
|        barrel       | 29.04 | 74.64 |
|        basket       | 40.25 | 65.84 |
|      waterfall      | 49.07 | 60.39 |
|         tent        | 92.44 | 98.02 |
|         bag         | 18.23 | 23.94 |
|       minibike      | 67.88 | 92.33 |
|        cradle       | 80.07 | 90.05 |
|         oven        | 35.74 | 65.48 |
|         ball        | 56.58 | 66.14 |
|         food        | 52.97 | 59.22 |
|         step        |  8.56 |  10.2 |
|         tank        | 59.74 | 61.56 |
|      trade name     | 18.47 | 22.51 |
|      microwave      |  65.0 |  72.5 |
|         pot         |  52.5 | 61.72 |
|        animal       | 64.88 | 68.58 |
|       bicycle       | 60.79 | 79.03 |
|         lake        | 47.73 | 57.71 |
|      dishwasher     |  66.9 | 74.41 |
|        screen       |  59.5 | 75.06 |
|       blanket       | 23.86 | 27.93 |
|      sculpture      | 74.65 | 85.17 |
|         hood        | 73.23 | 77.97 |
|        sconce       | 52.36 | 64.96 |
|         vase        | 40.25 | 53.39 |
|    traffic light    | 38.27 | 61.56 |
|         tray        | 16.66 | 30.97 |
|        ashcan       | 37.86 | 50.97 |
|         fan         | 69.18 | 80.18 |
|         pier        | 24.39 | 54.07 |
|      crt screen     |  2.69 |  7.71 |
|        plate        | 57.89 | 77.18 |
|       monitor       |  2.25 |  2.56 |
|    bulletin board   | 52.54 | 57.17 |
|        shower       |  6.42 | 22.72 |
|       radiator      | 70.06 | 87.99 |
|        glass        | 17.86 | 20.72 |
|        clock        | 47.19 | 51.55 |
|         flag        | 60.38 | 80.67 |
+---------------------+-------+-------+
2023-02-19 12:57:30,689 - mmseg - INFO - Summary:
2023-02-19 12:57:30,689 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 83.97 | 51.82 | 64.67 |
+-------+-------+-------+
2023-02-19 12:57:30,690 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 12:57:30,690 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8397, mIoU: 0.5182, mAcc: 0.6467, IoU.wall: 0.7908, IoU.building: 0.8297, IoU.sky: 0.9439, IoU.floor: 0.8111, IoU.tree: 0.7538, IoU.ceiling: 0.8427, IoU.road: 0.8404, IoU.bed : 0.9049, IoU.windowpane: 0.6319, IoU.grass: 0.6736, IoU.cabinet: 0.6027, IoU.sidewalk: 0.6964, IoU.person: 0.8263, IoU.earth: 0.3634, IoU.door: 0.5366, IoU.table: 0.6366, IoU.mountain: 0.6023, IoU.plant: 0.5172, IoU.curtain: 0.7646, IoU.chair: 0.6369, IoU.car: 0.8512, IoU.water: 0.6048, IoU.painting: 0.7711, IoU.sofa: 0.7388, IoU.shelf: 0.4546, IoU.house: 0.4022, IoU.sea: 0.6420, IoU.mirror: 0.7191, IoU.rug: 0.5033, IoU.field: 0.3198, IoU.armchair: 0.5143, IoU.seat: 0.5999, IoU.fence: 0.4547, IoU.desk: 0.5513, IoU.rock: 0.5454, IoU.wardrobe: 0.4269, IoU.lamp: 0.6696, IoU.bathtub: 0.8166, IoU.railing: 0.3773, IoU.cushion: 0.6136, IoU.base: 0.4077, IoU.box: 0.3131, IoU.column: 0.4819, IoU.signboard: 0.4009, IoU.chest of drawers: 0.3833, IoU.counter: 0.2471, IoU.sand: 0.5180, IoU.sink: 0.7404, IoU.skyscraper: 0.5016, IoU.fireplace: 0.7291, IoU.refrigerator: 0.8273, IoU.grandstand: 0.4148, IoU.path: 0.2453, IoU.stairs: 0.2874, IoU.runway: 0.6839, IoU.case: 0.4966, IoU.pool table: 0.9320, IoU.pillow: 0.5297, IoU.screen door: 0.7732, IoU.stairway: 0.3080, IoU.river: 0.1048, IoU.bridge: 0.6734, IoU.bookcase: 0.4546, IoU.blind: 0.4481, IoU.coffee table: 0.6085, IoU.toilet: 0.8668, IoU.flower: 0.4255, IoU.book: 0.4371, IoU.hill: 0.1276, IoU.bench: 0.4771, IoU.countertop: 0.5549, IoU.stove: 0.8250, IoU.palm: 0.5791, IoU.kitchen island: 0.4784, IoU.computer: 0.7753, IoU.swivel chair: 0.4313, IoU.boat: 0.5159, IoU.bar: 0.4881, IoU.arcade machine: 0.4707, IoU.hovel: 0.4480, IoU.bus: 0.8719, IoU.towel: 0.7227, IoU.light: 0.5777, IoU.truck: 0.3998, IoU.tower: 0.3563, IoU.chandelier: 0.6927, IoU.awning: 0.3624, IoU.streetlight: 0.3537, IoU.booth: 0.4436, IoU.television receiver: 0.7376, IoU.airplane: 0.6195, IoU.dirt track: 0.0687, IoU.apparel: 0.4494, IoU.pole: 0.2912, IoU.land: 0.0538, IoU.bannister: 0.1552, IoU.escalator: 0.4488, IoU.ottoman: 0.4717, IoU.bottle: 0.3565, IoU.buffet: 0.3509, IoU.poster: 0.3052, IoU.stage: 0.2052, IoU.van: 0.3549, IoU.ship: 0.3964, IoU.fountain: 0.2622, IoU.conveyer belt: 0.7252, IoU.canopy: 0.4472, IoU.washer: 0.7245, IoU.plaything: 0.3644, IoU.swimming pool: 0.5267, IoU.stool: 0.5035, IoU.barrel: 0.2904, IoU.basket: 0.4025, IoU.waterfall: 0.4907, IoU.tent: 0.9244, IoU.bag: 0.1823, IoU.minibike: 0.6788, IoU.cradle: 0.8007, IoU.oven: 0.3574, IoU.ball: 0.5658, IoU.food: 0.5297, IoU.step: 0.0856, IoU.tank: 0.5974, IoU.trade name: 0.1847, IoU.microwave: 0.6500, IoU.pot: 0.5250, IoU.animal: 0.6488, IoU.bicycle: 0.6079, IoU.lake: 0.4773, IoU.dishwasher: 0.6690, IoU.screen: 0.5950, IoU.blanket: 0.2386, IoU.sculpture: 0.7465, IoU.hood: 0.7323, IoU.sconce: 0.5236, IoU.vase: 0.4025, IoU.traffic light: 0.3827, IoU.tray: 0.1666, IoU.ashcan: 0.3786, IoU.fan: 0.6918, IoU.pier: 0.2439, IoU.crt screen: 0.0269, IoU.plate: 0.5789, IoU.monitor: 0.0225, IoU.bulletin board: 0.5254, IoU.shower: 0.0642, IoU.radiator: 0.7006, IoU.glass: 0.1786, IoU.clock: 0.4719, IoU.flag: 0.6038, Acc.wall: 0.8735, Acc.building: 0.9224, Acc.sky: 0.9789, Acc.floor: 0.9149, Acc.tree: 0.9029, Acc.ceiling: 0.9538, Acc.road: 0.9106, Acc.bed : 0.9638, Acc.windowpane: 0.7761, Acc.grass: 0.8133, Acc.cabinet: 0.7275, Acc.sidewalk: 0.8620, Acc.person: 0.9286, Acc.earth: 0.5154, Acc.door: 0.6665, Acc.table: 0.7468, Acc.mountain: 0.6982, Acc.plant: 0.6185, Acc.curtain: 0.8740, 
Acc.chair: 0.7875, Acc.car: 0.9095, Acc.water: 0.7553, Acc.painting: 0.9028, Acc.sofa: 0.8973, Acc.shelf: 0.6469, Acc.house: 0.4606, Acc.sea: 0.8450, Acc.mirror: 0.8187, Acc.rug: 0.5823, Acc.field: 0.4856, Acc.armchair: 0.6522, Acc.seat: 0.8869, Acc.fence: 0.5834, Acc.desk: 0.7283, Acc.rock: 0.7920, Acc.wardrobe: 0.6469, Acc.lamp: 0.7757, Acc.bathtub: 0.8543, Acc.railing: 0.5758, Acc.cushion: 0.8082, Acc.base: 0.5041, Acc.box: 0.4154, Acc.column: 0.6975, Acc.signboard: 0.5680, Acc.chest of drawers: 0.5769, Acc.counter: 0.3101, Acc.sand: 0.7365, Acc.sink: 0.8162, Acc.skyscraper: 0.6156, Acc.fireplace: 0.9327, Acc.refrigerator: 0.9325, Acc.grandstand: 0.7194, Acc.path: 0.3440, Acc.stairs: 0.3643, Acc.runway: 0.8999, Acc.case: 0.7452, Acc.pool table: 0.9735, Acc.pillow: 0.5902, Acc.screen door: 0.7985, Acc.stairway: 0.4465, Acc.river: 0.1900, Acc.bridge: 0.7774, Acc.bookcase: 0.6657, Acc.blind: 0.4947, Acc.coffee table: 0.8206, Acc.toilet: 0.9162, Acc.flower: 0.6030, Acc.book: 0.6684, Acc.hill: 0.2112, Acc.bench: 0.5250, Acc.countertop: 0.7797, Acc.stove: 0.8579, Acc.palm: 0.7573, Acc.kitchen island: 0.7507, Acc.computer: 0.8700, Acc.swivel chair: 0.5985, Acc.boat: 0.5875, Acc.bar: 0.6392, Acc.arcade machine: 0.4934, Acc.hovel: 0.5065, Acc.bus: 0.9663, Acc.towel: 0.8553, Acc.light: 0.6634, Acc.truck: 0.5018, Acc.tower: 0.4635, Acc.chandelier: 0.8719, Acc.awning: 0.4145, Acc.streetlight: 0.4355, Acc.booth: 0.4880, Acc.television receiver: 0.8205, Acc.airplane: 0.6633, Acc.dirt track: 0.2879, Acc.apparel: 0.6107, Acc.pole: 0.5417, Acc.land: 0.0739, Acc.bannister: 0.2549, Acc.escalator: 0.6215, Acc.ottoman: 0.7281, Acc.bottle: 0.6086, Acc.buffet: 0.3789, Acc.poster: 0.4399, Acc.stage: 0.2888, Acc.van: 0.4834, Acc.ship: 0.5627, Acc.fountain: 0.2652, Acc.conveyer belt: 0.9535, Acc.canopy: 0.6093, Acc.washer: 0.7598, Acc.plaything: 0.5121, Acc.swimming pool: 0.6821, Acc.stool: 0.7081, Acc.barrel: 0.7464, Acc.basket: 0.6584, Acc.waterfall: 0.6039, Acc.tent: 0.9802, Acc.bag: 0.2394, Acc.minibike: 0.9233, Acc.cradle: 0.9005, Acc.oven: 0.6548, Acc.ball: 0.6614, Acc.food: 0.5922, Acc.step: 0.1020, Acc.tank: 0.6156, Acc.trade name: 0.2251, Acc.microwave: 0.7250, Acc.pot: 0.6172, Acc.animal: 0.6858, Acc.bicycle: 0.7903, Acc.lake: 0.5771, Acc.dishwasher: 0.7441, Acc.screen: 0.7506, Acc.blanket: 0.2793, Acc.sculpture: 0.8517, Acc.hood: 0.7797, Acc.sconce: 0.6496, Acc.vase: 0.5339, Acc.traffic light: 0.6156, Acc.tray: 0.3097, Acc.ashcan: 0.5097, Acc.fan: 0.8018, Acc.pier: 0.5407, Acc.crt screen: 0.0771, Acc.plate: 0.7718, Acc.monitor: 0.0256, Acc.bulletin board: 0.5717, Acc.shower: 0.2272, Acc.radiator: 0.8799, Acc.glass: 0.2072, Acc.clock: 0.5155, Acc.flag: 0.8067
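For reference, the mIoU and mAcc printed in the summary above are simply the unweighted means of the 150 per-class IoU and Acc values, while aAcc is overall pixel accuracy (pixel-weighted, so it cannot be recovered from class means). Below is a minimal sketch, not part of mmseg, that recomputes the two class means from a single "Iter(val)" log line; the helper name is an assumption introduced only for illustration.

import re
from statistics import mean

def class_means_from_val_line(line: str):
    """Recompute mIoU and mAcc as unweighted means of the per-class
    'IoU.<class>: x.xxxx' and 'Acc.<class>: x.xxxx' entries found in
    one mmseg 'Iter(val)' log line."""
    # Class names may contain spaces (e.g. 'chest of drawers'), so match
    # everything up to the first colon after the 'IoU.' / 'Acc.' prefix.
    ious = [float(v) for v in re.findall(r"IoU\.[^:]+: ([0-9.]+)", line)]
    accs = [float(v) for v in re.findall(r"Acc\.[^:]+: ([0-9.]+)", line)]
    return mean(ious), mean(accs)

# Applied to the full validation line above, this should return roughly
# (0.5182, 0.6467), matching the printed mIoU and mAcc.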
2023-02-19 12:57:44,312 - mmseg - INFO - Iter [112050/160000]	lr: 1.798e-05, eta: 3:48:28, time: 0.559, data_time: 0.291, memory: 15214, decode.loss_ce: 0.1059, decode.acc_seg: 95.2641, aux.loss_ce: 0.0640, aux.acc_seg: 93.0902, loss: 0.1699, grad_norm: 1.4473
2023-02-19 12:57:57,989 - mmseg - INFO - Iter [112100/160000]	lr: 1.796e-05, eta: 3:48:13, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.0040, aux.loss_ce: 0.0678, aux.acc_seg: 92.7522, loss: 0.1814, grad_norm: 1.8409
2023-02-19 12:58:11,679 - mmseg - INFO - Iter [112150/160000]	lr: 1.794e-05, eta: 3:47:59, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1024, decode.acc_seg: 95.4152, aux.loss_ce: 0.0632, aux.acc_seg: 93.1533, loss: 0.1656, grad_norm: 1.4829
2023-02-19 12:58:25,295 - mmseg - INFO - Iter [112200/160000]	lr: 1.793e-05, eta: 3:47:44, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1140, decode.acc_seg: 95.0048, aux.loss_ce: 0.0655, aux.acc_seg: 92.9538, loss: 0.1795, grad_norm: 1.5790
2023-02-19 12:58:39,618 - mmseg - INFO - Iter [112250/160000]	lr: 1.791e-05, eta: 3:47:30, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1067, decode.acc_seg: 95.2535, aux.loss_ce: 0.0648, aux.acc_seg: 92.8209, loss: 0.1715, grad_norm: 1.8300
2023-02-19 12:58:53,310 - mmseg - INFO - Iter [112300/160000]	lr: 1.789e-05, eta: 3:47:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1113, decode.acc_seg: 95.1433, aux.loss_ce: 0.0651, aux.acc_seg: 93.0721, loss: 0.1764, grad_norm: 2.2427
2023-02-19 12:59:07,547 - mmseg - INFO - Iter [112350/160000]	lr: 1.787e-05, eta: 3:47:01, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1079, decode.acc_seg: 95.3996, aux.loss_ce: 0.0643, aux.acc_seg: 93.3196, loss: 0.1722, grad_norm: 1.4878
2023-02-19 12:59:21,208 - mmseg - INFO - Iter [112400/160000]	lr: 1.785e-05, eta: 3:46:46, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1013, decode.acc_seg: 95.4825, aux.loss_ce: 0.0620, aux.acc_seg: 93.2723, loss: 0.1633, grad_norm: 2.1310
2023-02-19 12:59:37,120 - mmseg - INFO - Iter [112450/160000]	lr: 1.783e-05, eta: 3:46:33, time: 0.318, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.4282, aux.loss_ce: 0.0638, aux.acc_seg: 93.1953, loss: 0.1696, grad_norm: 1.3981
2023-02-19 12:59:51,023 - mmseg - INFO - Iter [112500/160000]	lr: 1.781e-05, eta: 3:46:18, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1167, decode.acc_seg: 94.9820, aux.loss_ce: 0.0676, aux.acc_seg: 92.7831, loss: 0.1843, grad_norm: 2.1713
2023-02-19 13:00:05,322 - mmseg - INFO - Iter [112550/160000]	lr: 1.779e-05, eta: 3:46:04, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1070, decode.acc_seg: 95.3133, aux.loss_ce: 0.0629, aux.acc_seg: 93.3452, loss: 0.1699, grad_norm: 1.2081
2023-02-19 13:00:18,897 - mmseg - INFO - Iter [112600/160000]	lr: 1.778e-05, eta: 3:45:50, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1077, decode.acc_seg: 95.3294, aux.loss_ce: 0.0640, aux.acc_seg: 93.2457, loss: 0.1716, grad_norm: 1.3674
2023-02-19 13:00:32,667 - mmseg - INFO - Iter [112650/160000]	lr: 1.776e-05, eta: 3:45:35, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1118, decode.acc_seg: 95.1159, aux.loss_ce: 0.0652, aux.acc_seg: 93.0185, loss: 0.1770, grad_norm: 1.3985
2023-02-19 13:00:47,156 - mmseg - INFO - Iter [112700/160000]	lr: 1.774e-05, eta: 3:45:21, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1055, decode.acc_seg: 95.2576, aux.loss_ce: 0.0635, aux.acc_seg: 93.0567, loss: 0.1691, grad_norm: 1.3414
2023-02-19 13:01:01,197 - mmseg - INFO - Iter [112750/160000]	lr: 1.772e-05, eta: 3:45:06, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1094, decode.acc_seg: 95.2540, aux.loss_ce: 0.0632, aux.acc_seg: 93.3113, loss: 0.1726, grad_norm: 1.4641
2023-02-19 13:01:15,623 - mmseg - INFO - Iter [112800/160000]	lr: 1.770e-05, eta: 3:44:52, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1119, decode.acc_seg: 94.9660, aux.loss_ce: 0.0650, aux.acc_seg: 92.9295, loss: 0.1770, grad_norm: 1.5381
2023-02-19 13:01:29,347 - mmseg - INFO - Iter [112850/160000]	lr: 1.768e-05, eta: 3:44:38, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1070, decode.acc_seg: 95.2042, aux.loss_ce: 0.0629, aux.acc_seg: 93.1761, loss: 0.1699, grad_norm: 1.3038
2023-02-19 13:01:43,009 - mmseg - INFO - Iter [112900/160000]	lr: 1.766e-05, eta: 3:44:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1067, decode.acc_seg: 95.2922, aux.loss_ce: 0.0626, aux.acc_seg: 93.2013, loss: 0.1693, grad_norm: 1.3874
2023-02-19 13:01:56,691 - mmseg - INFO - Iter [112950/160000]	lr: 1.764e-05, eta: 3:44:09, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1075, decode.acc_seg: 95.2692, aux.loss_ce: 0.0645, aux.acc_seg: 93.0913, loss: 0.1720, grad_norm: 1.5665
2023-02-19 13:02:11,148 - mmseg - INFO - Saving checkpoint at 113000 iterations
2023-02-19 13:02:14,477 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:02:14,477 - mmseg - INFO - Iter [113000/160000]	lr: 1.763e-05, eta: 3:43:56, time: 0.356, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1082, decode.acc_seg: 95.2581, aux.loss_ce: 0.0651, aux.acc_seg: 93.0435, loss: 0.1733, grad_norm: 1.7138
2023-02-19 13:02:28,931 - mmseg - INFO - Iter [113050/160000]	lr: 1.761e-05, eta: 3:43:41, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1032, decode.acc_seg: 95.4798, aux.loss_ce: 0.0617, aux.acc_seg: 93.3665, loss: 0.1650, grad_norm: 1.4444
2023-02-19 13:02:42,600 - mmseg - INFO - Iter [113100/160000]	lr: 1.759e-05, eta: 3:43:27, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.3301, aux.loss_ce: 0.0607, aux.acc_seg: 93.3572, loss: 0.1655, grad_norm: 1.3317
2023-02-19 13:02:56,620 - mmseg - INFO - Iter [113150/160000]	lr: 1.757e-05, eta: 3:43:13, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1085, decode.acc_seg: 95.3346, aux.loss_ce: 0.0651, aux.acc_seg: 93.0833, loss: 0.1736, grad_norm: 1.7380
2023-02-19 13:03:10,230 - mmseg - INFO - Iter [113200/160000]	lr: 1.755e-05, eta: 3:42:58, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1124, decode.acc_seg: 95.1997, aux.loss_ce: 0.0686, aux.acc_seg: 92.7634, loss: 0.1809, grad_norm: 1.7813
2023-02-19 13:03:24,103 - mmseg - INFO - Iter [113250/160000]	lr: 1.753e-05, eta: 3:42:43, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1093, decode.acc_seg: 95.2070, aux.loss_ce: 0.0651, aux.acc_seg: 93.0737, loss: 0.1744, grad_norm: 1.4410
2023-02-19 13:03:37,791 - mmseg - INFO - Iter [113300/160000]	lr: 1.751e-05, eta: 3:42:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.4773, aux.loss_ce: 0.0608, aux.acc_seg: 93.5636, loss: 0.1657, grad_norm: 1.5809
2023-02-19 13:03:51,394 - mmseg - INFO - Iter [113350/160000]	lr: 1.749e-05, eta: 3:42:14, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1083, decode.acc_seg: 95.3104, aux.loss_ce: 0.0632, aux.acc_seg: 93.3800, loss: 0.1715, grad_norm: 1.5169
2023-02-19 13:04:05,360 - mmseg - INFO - Iter [113400/160000]	lr: 1.748e-05, eta: 3:42:00, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.1428, aux.loss_ce: 0.0654, aux.acc_seg: 93.1022, loss: 0.1763, grad_norm: 1.7108
2023-02-19 13:04:19,076 - mmseg - INFO - Iter [113450/160000]	lr: 1.746e-05, eta: 3:41:45, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1097, decode.acc_seg: 95.2136, aux.loss_ce: 0.0646, aux.acc_seg: 93.0820, loss: 0.1742, grad_norm: 1.4150
2023-02-19 13:04:33,441 - mmseg - INFO - Iter [113500/160000]	lr: 1.744e-05, eta: 3:41:31, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.2392, aux.loss_ce: 0.0611, aux.acc_seg: 93.3934, loss: 0.1679, grad_norm: 1.5205
2023-02-19 13:04:47,284 - mmseg - INFO - Iter [113550/160000]	lr: 1.742e-05, eta: 3:41:17, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.3579, aux.loss_ce: 0.0627, aux.acc_seg: 93.1797, loss: 0.1664, grad_norm: 1.5483
2023-02-19 13:05:01,805 - mmseg - INFO - Iter [113600/160000]	lr: 1.740e-05, eta: 3:41:02, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.4705, aux.loss_ce: 0.0622, aux.acc_seg: 93.1369, loss: 0.1637, grad_norm: 1.1844
2023-02-19 13:05:16,272 - mmseg - INFO - Iter [113650/160000]	lr: 1.738e-05, eta: 3:40:48, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1100, decode.acc_seg: 95.2015, aux.loss_ce: 0.0650, aux.acc_seg: 93.0171, loss: 0.1750, grad_norm: 1.4059
2023-02-19 13:05:32,417 - mmseg - INFO - Iter [113700/160000]	lr: 1.736e-05, eta: 3:40:35, time: 0.323, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1099, decode.acc_seg: 95.2388, aux.loss_ce: 0.0656, aux.acc_seg: 93.0432, loss: 0.1755, grad_norm: 1.7346
2023-02-19 13:05:46,157 - mmseg - INFO - Iter [113750/160000]	lr: 1.734e-05, eta: 3:40:20, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1063, decode.acc_seg: 95.3661, aux.loss_ce: 0.0614, aux.acc_seg: 93.4068, loss: 0.1677, grad_norm: 1.5566
2023-02-19 13:06:00,137 - mmseg - INFO - Iter [113800/160000]	lr: 1.733e-05, eta: 3:40:06, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1095, decode.acc_seg: 95.2721, aux.loss_ce: 0.0639, aux.acc_seg: 93.2985, loss: 0.1733, grad_norm: 1.9986
2023-02-19 13:06:13,819 - mmseg - INFO - Iter [113850/160000]	lr: 1.731e-05, eta: 3:39:51, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1120, decode.acc_seg: 95.0984, aux.loss_ce: 0.0661, aux.acc_seg: 92.9415, loss: 0.1781, grad_norm: 1.8732
2023-02-19 13:06:27,459 - mmseg - INFO - Iter [113900/160000]	lr: 1.729e-05, eta: 3:39:37, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.2121, aux.loss_ce: 0.0652, aux.acc_seg: 93.0167, loss: 0.1742, grad_norm: 1.6011
2023-02-19 13:06:41,168 - mmseg - INFO - Iter [113950/160000]	lr: 1.727e-05, eta: 3:39:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0993, decode.acc_seg: 95.4167, aux.loss_ce: 0.0599, aux.acc_seg: 93.3030, loss: 0.1593, grad_norm: 1.2947
2023-02-19 13:06:55,145 - mmseg - INFO - Saving checkpoint at 114000 iterations
2023-02-19 13:06:58,448 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:06:58,448 - mmseg - INFO - Iter [114000/160000]	lr: 1.725e-05, eta: 3:39:09, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.1459, aux.loss_ce: 0.0637, aux.acc_seg: 93.1821, loss: 0.1746, grad_norm: 1.5731
2023-02-19 13:07:12,807 - mmseg - INFO - Iter [114050/160000]	lr: 1.723e-05, eta: 3:38:55, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1063, decode.acc_seg: 95.3503, aux.loss_ce: 0.0615, aux.acc_seg: 93.2669, loss: 0.1678, grad_norm: 1.3683
2023-02-19 13:07:26,907 - mmseg - INFO - Iter [114100/160000]	lr: 1.721e-05, eta: 3:38:40, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1032, decode.acc_seg: 95.4457, aux.loss_ce: 0.0607, aux.acc_seg: 93.4631, loss: 0.1639, grad_norm: 1.1098
2023-02-19 13:07:40,613 - mmseg - INFO - Iter [114150/160000]	lr: 1.719e-05, eta: 3:38:26, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1109, decode.acc_seg: 95.1100, aux.loss_ce: 0.0672, aux.acc_seg: 92.8306, loss: 0.1781, grad_norm: 1.5382
2023-02-19 13:07:54,299 - mmseg - INFO - Iter [114200/160000]	lr: 1.718e-05, eta: 3:38:11, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.4589, aux.loss_ce: 0.0634, aux.acc_seg: 93.3112, loss: 0.1691, grad_norm: 1.3934
2023-02-19 13:08:07,851 - mmseg - INFO - Iter [114250/160000]	lr: 1.716e-05, eta: 3:37:57, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1036, decode.acc_seg: 95.3252, aux.loss_ce: 0.0609, aux.acc_seg: 93.2205, loss: 0.1645, grad_norm: 1.3756
2023-02-19 13:08:22,099 - mmseg - INFO - Iter [114300/160000]	lr: 1.714e-05, eta: 3:37:42, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1133, decode.acc_seg: 95.2023, aux.loss_ce: 0.0661, aux.acc_seg: 93.1478, loss: 0.1794, grad_norm: 1.3484
2023-02-19 13:08:36,056 - mmseg - INFO - Iter [114350/160000]	lr: 1.712e-05, eta: 3:37:28, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1066, decode.acc_seg: 95.3557, aux.loss_ce: 0.0642, aux.acc_seg: 93.1228, loss: 0.1708, grad_norm: 1.5491
2023-02-19 13:08:49,997 - mmseg - INFO - Iter [114400/160000]	lr: 1.710e-05, eta: 3:37:14, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1104, decode.acc_seg: 95.1041, aux.loss_ce: 0.0653, aux.acc_seg: 92.8580, loss: 0.1758, grad_norm: 1.2982
2023-02-19 13:09:04,386 - mmseg - INFO - Iter [114450/160000]	lr: 1.708e-05, eta: 3:36:59, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1138, decode.acc_seg: 95.0488, aux.loss_ce: 0.0667, aux.acc_seg: 92.9163, loss: 0.1805, grad_norm: 1.3788
2023-02-19 13:09:18,200 - mmseg - INFO - Iter [114500/160000]	lr: 1.706e-05, eta: 3:36:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.3048, aux.loss_ce: 0.0646, aux.acc_seg: 93.1689, loss: 0.1735, grad_norm: 1.5383
2023-02-19 13:09:32,287 - mmseg - INFO - Iter [114550/160000]	lr: 1.704e-05, eta: 3:36:30, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.0427, aux.loss_ce: 0.0642, aux.acc_seg: 92.9191, loss: 0.1730, grad_norm: 1.3206
2023-02-19 13:09:46,125 - mmseg - INFO - Iter [114600/160000]	lr: 1.703e-05, eta: 3:36:16, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1085, decode.acc_seg: 95.3138, aux.loss_ce: 0.0652, aux.acc_seg: 93.0618, loss: 0.1736, grad_norm: 1.3337
2023-02-19 13:10:00,301 - mmseg - INFO - Iter [114650/160000]	lr: 1.701e-05, eta: 3:36:02, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4616, aux.loss_ce: 0.0625, aux.acc_seg: 93.3664, loss: 0.1668, grad_norm: 1.1858
2023-02-19 13:10:14,184 - mmseg - INFO - Iter [114700/160000]	lr: 1.699e-05, eta: 3:35:47, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1061, decode.acc_seg: 95.3398, aux.loss_ce: 0.0622, aux.acc_seg: 93.2696, loss: 0.1683, grad_norm: 1.6584
2023-02-19 13:10:28,639 - mmseg - INFO - Iter [114750/160000]	lr: 1.697e-05, eta: 3:35:33, time: 0.289, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1067, decode.acc_seg: 95.2840, aux.loss_ce: 0.0624, aux.acc_seg: 93.2471, loss: 0.1691, grad_norm: 1.4801
2023-02-19 13:10:42,290 - mmseg - INFO - Iter [114800/160000]	lr: 1.695e-05, eta: 3:35:18, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.4293, aux.loss_ce: 0.0644, aux.acc_seg: 93.1053, loss: 0.1700, grad_norm: 1.4314
2023-02-19 13:10:56,080 - mmseg - INFO - Iter [114850/160000]	lr: 1.693e-05, eta: 3:35:04, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1090, decode.acc_seg: 95.3153, aux.loss_ce: 0.0659, aux.acc_seg: 93.0095, loss: 0.1749, grad_norm: 1.5305
2023-02-19 13:11:09,907 - mmseg - INFO - Iter [114900/160000]	lr: 1.691e-05, eta: 3:34:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1034, decode.acc_seg: 95.3966, aux.loss_ce: 0.0631, aux.acc_seg: 93.2371, loss: 0.1665, grad_norm: 1.3155
2023-02-19 13:11:25,898 - mmseg - INFO - Iter [114950/160000]	lr: 1.689e-05, eta: 3:34:36, time: 0.320, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1087, decode.acc_seg: 95.3048, aux.loss_ce: 0.0638, aux.acc_seg: 93.2453, loss: 0.1725, grad_norm: 1.4377
2023-02-19 13:11:40,095 - mmseg - INFO - Saving checkpoint at 115000 iterations
2023-02-19 13:11:43,317 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:11:43,317 - mmseg - INFO - Iter [115000/160000]	lr: 1.688e-05, eta: 3:34:23, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1063, decode.acc_seg: 95.3283, aux.loss_ce: 0.0624, aux.acc_seg: 93.2691, loss: 0.1687, grad_norm: 1.6581
2023-02-19 13:11:58,545 - mmseg - INFO - Iter [115050/160000]	lr: 1.686e-05, eta: 3:34:09, time: 0.304, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.5649, aux.loss_ce: 0.0601, aux.acc_seg: 93.4769, loss: 0.1609, grad_norm: 1.2150
2023-02-19 13:12:12,198 - mmseg - INFO - Iter [115100/160000]	lr: 1.684e-05, eta: 3:33:54, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1076, decode.acc_seg: 95.2677, aux.loss_ce: 0.0665, aux.acc_seg: 92.9363, loss: 0.1741, grad_norm: 1.6464
2023-02-19 13:12:25,789 - mmseg - INFO - Iter [115150/160000]	lr: 1.682e-05, eta: 3:33:40, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.3858, aux.loss_ce: 0.0640, aux.acc_seg: 93.0965, loss: 0.1688, grad_norm: 1.3292
2023-02-19 13:12:40,364 - mmseg - INFO - Iter [115200/160000]	lr: 1.680e-05, eta: 3:33:26, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1135, decode.acc_seg: 95.2977, aux.loss_ce: 0.0661, aux.acc_seg: 93.1537, loss: 0.1796, grad_norm: 1.6524
2023-02-19 13:12:53,902 - mmseg - INFO - Iter [115250/160000]	lr: 1.678e-05, eta: 3:33:11, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1059, decode.acc_seg: 95.2653, aux.loss_ce: 0.0645, aux.acc_seg: 93.0157, loss: 0.1703, grad_norm: 1.5226
2023-02-19 13:13:07,604 - mmseg - INFO - Iter [115300/160000]	lr: 1.676e-05, eta: 3:32:56, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.6092, aux.loss_ce: 0.0602, aux.acc_seg: 93.4947, loss: 0.1608, grad_norm: 1.1537
2023-02-19 13:13:21,996 - mmseg - INFO - Iter [115350/160000]	lr: 1.674e-05, eta: 3:32:42, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1032, decode.acc_seg: 95.4003, aux.loss_ce: 0.0616, aux.acc_seg: 93.3294, loss: 0.1648, grad_norm: 1.3373
2023-02-19 13:13:36,096 - mmseg - INFO - Iter [115400/160000]	lr: 1.673e-05, eta: 3:32:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.5265, aux.loss_ce: 0.0594, aux.acc_seg: 93.5931, loss: 0.1602, grad_norm: 1.3373
2023-02-19 13:13:49,710 - mmseg - INFO - Iter [115450/160000]	lr: 1.671e-05, eta: 3:32:13, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1011, decode.acc_seg: 95.5330, aux.loss_ce: 0.0604, aux.acc_seg: 93.5554, loss: 0.1615, grad_norm: 1.7726
2023-02-19 13:14:03,744 - mmseg - INFO - Iter [115500/160000]	lr: 1.669e-05, eta: 3:31:59, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1028, decode.acc_seg: 95.5568, aux.loss_ce: 0.0631, aux.acc_seg: 93.3170, loss: 0.1660, grad_norm: 1.6900
2023-02-19 13:14:17,820 - mmseg - INFO - Iter [115550/160000]	lr: 1.667e-05, eta: 3:31:45, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.2520, aux.loss_ce: 0.0663, aux.acc_seg: 92.9661, loss: 0.1771, grad_norm: 1.4172
2023-02-19 13:14:32,223 - mmseg - INFO - Iter [115600/160000]	lr: 1.665e-05, eta: 3:31:30, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 94.9623, aux.loss_ce: 0.0658, aux.acc_seg: 92.9940, loss: 0.1812, grad_norm: 1.6433
2023-02-19 13:14:46,312 - mmseg - INFO - Iter [115650/160000]	lr: 1.663e-05, eta: 3:31:16, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1057, decode.acc_seg: 95.3055, aux.loss_ce: 0.0625, aux.acc_seg: 93.2118, loss: 0.1682, grad_norm: 1.4113
2023-02-19 13:14:59,963 - mmseg - INFO - Iter [115700/160000]	lr: 1.661e-05, eta: 3:31:01, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1055, decode.acc_seg: 95.3583, aux.loss_ce: 0.0631, aux.acc_seg: 93.2365, loss: 0.1686, grad_norm: 1.4741
2023-02-19 13:15:14,413 - mmseg - INFO - Iter [115750/160000]	lr: 1.659e-05, eta: 3:30:47, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1081, decode.acc_seg: 95.2374, aux.loss_ce: 0.0636, aux.acc_seg: 93.1797, loss: 0.1717, grad_norm: 1.7074
2023-02-19 13:15:28,817 - mmseg - INFO - Iter [115800/160000]	lr: 1.658e-05, eta: 3:30:33, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1076, decode.acc_seg: 95.2934, aux.loss_ce: 0.0656, aux.acc_seg: 93.0991, loss: 0.1732, grad_norm: 1.6153
2023-02-19 13:15:42,937 - mmseg - INFO - Iter [115850/160000]	lr: 1.656e-05, eta: 3:30:19, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1016, decode.acc_seg: 95.4863, aux.loss_ce: 0.0606, aux.acc_seg: 93.4781, loss: 0.1622, grad_norm: 1.3573
2023-02-19 13:15:57,448 - mmseg - INFO - Iter [115900/160000]	lr: 1.654e-05, eta: 3:30:04, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1019, decode.acc_seg: 95.4353, aux.loss_ce: 0.0610, aux.acc_seg: 93.4500, loss: 0.1629, grad_norm: 1.3442
2023-02-19 13:16:11,563 - mmseg - INFO - Iter [115950/160000]	lr: 1.652e-05, eta: 3:29:50, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0985, decode.acc_seg: 95.6520, aux.loss_ce: 0.0596, aux.acc_seg: 93.5437, loss: 0.1581, grad_norm: 1.3060
2023-02-19 13:16:25,419 - mmseg - INFO - Saving checkpoint at 116000 iterations
2023-02-19 13:16:28,694 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:16:28,695 - mmseg - INFO - Iter [116000/160000]	lr: 1.650e-05, eta: 3:29:37, time: 0.343, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1093, decode.acc_seg: 95.2965, aux.loss_ce: 0.0640, aux.acc_seg: 93.2811, loss: 0.1733, grad_norm: 1.6817
2023-02-19 13:16:42,987 - mmseg - INFO - Iter [116050/160000]	lr: 1.648e-05, eta: 3:29:22, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1105, decode.acc_seg: 95.0894, aux.loss_ce: 0.0662, aux.acc_seg: 92.7877, loss: 0.1767, grad_norm: 1.4697
2023-02-19 13:16:57,378 - mmseg - INFO - Iter [116100/160000]	lr: 1.646e-05, eta: 3:29:08, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.1618, aux.loss_ce: 0.0647, aux.acc_seg: 93.0018, loss: 0.1754, grad_norm: 1.2767
2023-02-19 13:17:11,633 - mmseg - INFO - Iter [116150/160000]	lr: 1.644e-05, eta: 3:28:54, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.2961, aux.loss_ce: 0.0624, aux.acc_seg: 93.1535, loss: 0.1689, grad_norm: 1.3445
2023-02-19 13:17:27,637 - mmseg - INFO - Iter [116200/160000]	lr: 1.643e-05, eta: 3:28:40, time: 0.320, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1130, decode.acc_seg: 95.1238, aux.loss_ce: 0.0674, aux.acc_seg: 92.8882, loss: 0.1803, grad_norm: 1.7098
2023-02-19 13:17:41,722 - mmseg - INFO - Iter [116250/160000]	lr: 1.641e-05, eta: 3:28:26, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.4891, aux.loss_ce: 0.0604, aux.acc_seg: 93.5553, loss: 0.1634, grad_norm: 1.3855
2023-02-19 13:17:56,279 - mmseg - INFO - Iter [116300/160000]	lr: 1.639e-05, eta: 3:28:12, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1079, decode.acc_seg: 95.2977, aux.loss_ce: 0.0658, aux.acc_seg: 93.0939, loss: 0.1737, grad_norm: 1.8110
2023-02-19 13:18:10,001 - mmseg - INFO - Iter [116350/160000]	lr: 1.637e-05, eta: 3:27:57, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1099, decode.acc_seg: 95.2528, aux.loss_ce: 0.0632, aux.acc_seg: 93.3146, loss: 0.1730, grad_norm: 1.3571
2023-02-19 13:18:23,771 - mmseg - INFO - Iter [116400/160000]	lr: 1.635e-05, eta: 3:27:43, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1045, decode.acc_seg: 95.3303, aux.loss_ce: 0.0610, aux.acc_seg: 93.3491, loss: 0.1654, grad_norm: 1.3723
2023-02-19 13:18:38,013 - mmseg - INFO - Iter [116450/160000]	lr: 1.633e-05, eta: 3:27:28, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1029, decode.acc_seg: 95.4952, aux.loss_ce: 0.0606, aux.acc_seg: 93.4922, loss: 0.1635, grad_norm: 1.6320
2023-02-19 13:18:51,790 - mmseg - INFO - Iter [116500/160000]	lr: 1.631e-05, eta: 3:27:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.5924, aux.loss_ce: 0.0593, aux.acc_seg: 93.6355, loss: 0.1610, grad_norm: 1.3219
2023-02-19 13:19:05,928 - mmseg - INFO - Iter [116550/160000]	lr: 1.629e-05, eta: 3:27:00, time: 0.283, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1044, decode.acc_seg: 95.4422, aux.loss_ce: 0.0629, aux.acc_seg: 93.3138, loss: 0.1672, grad_norm: 1.5674
2023-02-19 13:19:20,823 - mmseg - INFO - Iter [116600/160000]	lr: 1.628e-05, eta: 3:26:45, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1113, decode.acc_seg: 95.1215, aux.loss_ce: 0.0662, aux.acc_seg: 92.9504, loss: 0.1776, grad_norm: 2.0979
2023-02-19 13:19:34,645 - mmseg - INFO - Iter [116650/160000]	lr: 1.626e-05, eta: 3:26:31, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1017, decode.acc_seg: 95.4572, aux.loss_ce: 0.0607, aux.acc_seg: 93.4627, loss: 0.1624, grad_norm: 1.4153
2023-02-19 13:19:49,157 - mmseg - INFO - Iter [116700/160000]	lr: 1.624e-05, eta: 3:26:17, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1101, decode.acc_seg: 95.1916, aux.loss_ce: 0.0670, aux.acc_seg: 92.9887, loss: 0.1772, grad_norm: 1.6280
2023-02-19 13:20:03,115 - mmseg - INFO - Iter [116750/160000]	lr: 1.622e-05, eta: 3:26:02, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1089, decode.acc_seg: 95.2834, aux.loss_ce: 0.0648, aux.acc_seg: 93.1408, loss: 0.1737, grad_norm: 1.3769
2023-02-19 13:20:17,794 - mmseg - INFO - Iter [116800/160000]	lr: 1.620e-05, eta: 3:25:48, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1147, decode.acc_seg: 95.0168, aux.loss_ce: 0.0682, aux.acc_seg: 92.5838, loss: 0.1829, grad_norm: 1.9684
2023-02-19 13:20:31,895 - mmseg - INFO - Iter [116850/160000]	lr: 1.618e-05, eta: 3:25:34, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1092, decode.acc_seg: 95.3228, aux.loss_ce: 0.0649, aux.acc_seg: 93.1563, loss: 0.1741, grad_norm: 1.6184
2023-02-19 13:20:46,311 - mmseg - INFO - Iter [116900/160000]	lr: 1.616e-05, eta: 3:25:20, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1033, decode.acc_seg: 95.3822, aux.loss_ce: 0.0611, aux.acc_seg: 93.3569, loss: 0.1644, grad_norm: 1.4384
2023-02-19 13:21:00,256 - mmseg - INFO - Iter [116950/160000]	lr: 1.614e-05, eta: 3:25:05, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1044, decode.acc_seg: 95.4485, aux.loss_ce: 0.0626, aux.acc_seg: 93.2644, loss: 0.1670, grad_norm: 1.4766
2023-02-19 13:21:15,485 - mmseg - INFO - Saving checkpoint at 117000 iterations
2023-02-19 13:21:18,724 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:21:18,724 - mmseg - INFO - Iter [117000/160000]	lr: 1.613e-05, eta: 3:24:52, time: 0.370, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.4641, aux.loss_ce: 0.0598, aux.acc_seg: 93.4530, loss: 0.1612, grad_norm: 1.6151
2023-02-19 13:21:32,487 - mmseg - INFO - Iter [117050/160000]	lr: 1.611e-05, eta: 3:24:38, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1069, decode.acc_seg: 95.2384, aux.loss_ce: 0.0649, aux.acc_seg: 92.9564, loss: 0.1718, grad_norm: 1.6002
2023-02-19 13:21:46,442 - mmseg - INFO - Iter [117100/160000]	lr: 1.609e-05, eta: 3:24:24, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1133, decode.acc_seg: 95.1165, aux.loss_ce: 0.0666, aux.acc_seg: 92.9406, loss: 0.1799, grad_norm: 1.4226
2023-02-19 13:22:00,422 - mmseg - INFO - Iter [117150/160000]	lr: 1.607e-05, eta: 3:24:09, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.5484, aux.loss_ce: 0.0588, aux.acc_seg: 93.5240, loss: 0.1596, grad_norm: 1.2912
2023-02-19 13:22:14,037 - mmseg - INFO - Iter [117200/160000]	lr: 1.605e-05, eta: 3:23:55, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1115, decode.acc_seg: 95.3275, aux.loss_ce: 0.0630, aux.acc_seg: 93.3695, loss: 0.1745, grad_norm: 1.4970
2023-02-19 13:22:27,708 - mmseg - INFO - Iter [117250/160000]	lr: 1.603e-05, eta: 3:23:40, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.6297, aux.loss_ce: 0.0610, aux.acc_seg: 93.4922, loss: 0.1623, grad_norm: 1.2770
2023-02-19 13:22:41,834 - mmseg - INFO - Iter [117300/160000]	lr: 1.601e-05, eta: 3:23:26, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1046, decode.acc_seg: 95.4368, aux.loss_ce: 0.0605, aux.acc_seg: 93.6159, loss: 0.1651, grad_norm: 2.5676
2023-02-19 13:22:55,570 - mmseg - INFO - Iter [117350/160000]	lr: 1.599e-05, eta: 3:23:11, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1070, decode.acc_seg: 95.2489, aux.loss_ce: 0.0613, aux.acc_seg: 93.3705, loss: 0.1682, grad_norm: 1.4028
2023-02-19 13:23:10,398 - mmseg - INFO - Iter [117400/160000]	lr: 1.598e-05, eta: 3:22:57, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1025, decode.acc_seg: 95.4440, aux.loss_ce: 0.0621, aux.acc_seg: 93.3158, loss: 0.1646, grad_norm: 1.2797
2023-02-19 13:23:24,541 - mmseg - INFO - Iter [117450/160000]	lr: 1.596e-05, eta: 3:22:43, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1044, decode.acc_seg: 95.3570, aux.loss_ce: 0.0611, aux.acc_seg: 93.4200, loss: 0.1656, grad_norm: 1.3436
2023-02-19 13:23:40,975 - mmseg - INFO - Iter [117500/160000]	lr: 1.594e-05, eta: 3:22:29, time: 0.329, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1044, decode.acc_seg: 95.3988, aux.loss_ce: 0.0613, aux.acc_seg: 93.3297, loss: 0.1657, grad_norm: 1.2209
2023-02-19 13:23:54,772 - mmseg - INFO - Iter [117550/160000]	lr: 1.592e-05, eta: 3:22:15, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1094, decode.acc_seg: 95.2982, aux.loss_ce: 0.0645, aux.acc_seg: 93.2368, loss: 0.1740, grad_norm: 1.5894
2023-02-19 13:24:08,767 - mmseg - INFO - Iter [117600/160000]	lr: 1.590e-05, eta: 3:22:00, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1072, decode.acc_seg: 95.3277, aux.loss_ce: 0.0637, aux.acc_seg: 93.2300, loss: 0.1710, grad_norm: 1.4845
2023-02-19 13:24:23,237 - mmseg - INFO - Iter [117650/160000]	lr: 1.588e-05, eta: 3:21:46, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1154, decode.acc_seg: 95.3809, aux.loss_ce: 0.0688, aux.acc_seg: 93.1535, loss: 0.1842, grad_norm: 1.8681
2023-02-19 13:24:37,372 - mmseg - INFO - Iter [117700/160000]	lr: 1.586e-05, eta: 3:21:32, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1042, decode.acc_seg: 95.3344, aux.loss_ce: 0.0627, aux.acc_seg: 93.1248, loss: 0.1669, grad_norm: 1.5778
2023-02-19 13:24:51,318 - mmseg - INFO - Iter [117750/160000]	lr: 1.584e-05, eta: 3:21:17, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.6376, aux.loss_ce: 0.0580, aux.acc_seg: 93.7209, loss: 0.1567, grad_norm: 1.1826
2023-02-19 13:25:04,983 - mmseg - INFO - Iter [117800/160000]	lr: 1.583e-05, eta: 3:21:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1097, decode.acc_seg: 95.2506, aux.loss_ce: 0.0653, aux.acc_seg: 92.9760, loss: 0.1750, grad_norm: 1.4842
2023-02-19 13:25:18,614 - mmseg - INFO - Iter [117850/160000]	lr: 1.581e-05, eta: 3:20:48, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1024, decode.acc_seg: 95.4950, aux.loss_ce: 0.0611, aux.acc_seg: 93.3690, loss: 0.1635, grad_norm: 1.3250
2023-02-19 13:25:32,297 - mmseg - INFO - Iter [117900/160000]	lr: 1.579e-05, eta: 3:20:34, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1113, decode.acc_seg: 95.0840, aux.loss_ce: 0.0644, aux.acc_seg: 93.0541, loss: 0.1758, grad_norm: 1.1899
2023-02-19 13:25:46,434 - mmseg - INFO - Iter [117950/160000]	lr: 1.577e-05, eta: 3:20:19, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1063, decode.acc_seg: 95.2787, aux.loss_ce: 0.0618, aux.acc_seg: 93.3236, loss: 0.1681, grad_norm: 1.3612
2023-02-19 13:26:00,806 - mmseg - INFO - Saving checkpoint at 118000 iterations
2023-02-19 13:26:04,037 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:26:04,037 - mmseg - INFO - Iter [118000/160000]	lr: 1.575e-05, eta: 3:20:06, time: 0.352, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.6239, aux.loss_ce: 0.0580, aux.acc_seg: 93.6502, loss: 0.1565, grad_norm: 1.0710
2023-02-19 13:26:17,800 - mmseg - INFO - Iter [118050/160000]	lr: 1.573e-05, eta: 3:19:52, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1083, decode.acc_seg: 95.1870, aux.loss_ce: 0.0642, aux.acc_seg: 93.0782, loss: 0.1725, grad_norm: 1.4833
2023-02-19 13:26:31,748 - mmseg - INFO - Iter [118100/160000]	lr: 1.571e-05, eta: 3:19:37, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1092, decode.acc_seg: 95.1574, aux.loss_ce: 0.0650, aux.acc_seg: 93.1036, loss: 0.1742, grad_norm: 1.9013
2023-02-19 13:26:45,977 - mmseg - INFO - Iter [118150/160000]	lr: 1.569e-05, eta: 3:19:23, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1072, decode.acc_seg: 95.3145, aux.loss_ce: 0.0632, aux.acc_seg: 93.2873, loss: 0.1705, grad_norm: 1.9639
2023-02-19 13:26:59,585 - mmseg - INFO - Iter [118200/160000]	lr: 1.568e-05, eta: 3:19:09, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1049, decode.acc_seg: 95.3752, aux.loss_ce: 0.0624, aux.acc_seg: 93.2383, loss: 0.1674, grad_norm: 1.2849
2023-02-19 13:27:13,418 - mmseg - INFO - Iter [118250/160000]	lr: 1.566e-05, eta: 3:18:54, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1071, decode.acc_seg: 95.3240, aux.loss_ce: 0.0639, aux.acc_seg: 93.1455, loss: 0.1710, grad_norm: 1.7809
2023-02-19 13:27:27,612 - mmseg - INFO - Iter [118300/160000]	lr: 1.564e-05, eta: 3:18:40, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1064, decode.acc_seg: 95.3841, aux.loss_ce: 0.0601, aux.acc_seg: 93.4877, loss: 0.1665, grad_norm: 1.5253
2023-02-19 13:27:41,875 - mmseg - INFO - Iter [118350/160000]	lr: 1.562e-05, eta: 3:18:26, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1052, decode.acc_seg: 95.3180, aux.loss_ce: 0.0615, aux.acc_seg: 93.3220, loss: 0.1667, grad_norm: 1.4709
2023-02-19 13:27:55,929 - mmseg - INFO - Iter [118400/160000]	lr: 1.560e-05, eta: 3:18:11, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.3748, aux.loss_ce: 0.0614, aux.acc_seg: 93.3625, loss: 0.1670, grad_norm: 1.2634
2023-02-19 13:28:10,300 - mmseg - INFO - Iter [118450/160000]	lr: 1.558e-05, eta: 3:17:57, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.5803, aux.loss_ce: 0.0601, aux.acc_seg: 93.5179, loss: 0.1615, grad_norm: 1.6056
2023-02-19 13:28:24,184 - mmseg - INFO - Iter [118500/160000]	lr: 1.556e-05, eta: 3:17:42, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1108, decode.acc_seg: 95.1943, aux.loss_ce: 0.0634, aux.acc_seg: 93.2157, loss: 0.1742, grad_norm: 1.3355
2023-02-19 13:28:38,425 - mmseg - INFO - Iter [118550/160000]	lr: 1.554e-05, eta: 3:17:28, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.7042, aux.loss_ce: 0.0576, aux.acc_seg: 93.6441, loss: 0.1529, grad_norm: 1.2427
2023-02-19 13:28:52,163 - mmseg - INFO - Iter [118600/160000]	lr: 1.553e-05, eta: 3:17:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1026, decode.acc_seg: 95.5618, aux.loss_ce: 0.0606, aux.acc_seg: 93.5515, loss: 0.1633, grad_norm: 1.8167
2023-02-19 13:29:06,097 - mmseg - INFO - Iter [118650/160000]	lr: 1.551e-05, eta: 3:16:59, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.4745, aux.loss_ce: 0.0588, aux.acc_seg: 93.5073, loss: 0.1595, grad_norm: 1.3310
2023-02-19 13:29:20,285 - mmseg - INFO - Iter [118700/160000]	lr: 1.549e-05, eta: 3:16:45, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.4274, aux.loss_ce: 0.0618, aux.acc_seg: 93.3091, loss: 0.1655, grad_norm: 1.5227
2023-02-19 13:29:36,842 - mmseg - INFO - Iter [118750/160000]	lr: 1.547e-05, eta: 3:16:31, time: 0.331, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2733, aux.loss_ce: 0.0671, aux.acc_seg: 93.0794, loss: 0.1760, grad_norm: 1.6320
2023-02-19 13:29:50,748 - mmseg - INFO - Iter [118800/160000]	lr: 1.545e-05, eta: 3:16:17, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1078, decode.acc_seg: 95.2721, aux.loss_ce: 0.0642, aux.acc_seg: 93.0786, loss: 0.1720, grad_norm: 1.4621
2023-02-19 13:30:04,327 - mmseg - INFO - Iter [118850/160000]	lr: 1.543e-05, eta: 3:16:02, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.2026, aux.loss_ce: 0.0629, aux.acc_seg: 93.1379, loss: 0.1703, grad_norm: 1.9801
2023-02-19 13:30:18,161 - mmseg - INFO - Iter [118900/160000]	lr: 1.541e-05, eta: 3:15:48, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1010, decode.acc_seg: 95.4962, aux.loss_ce: 0.0615, aux.acc_seg: 93.3564, loss: 0.1624, grad_norm: 1.5218
2023-02-19 13:30:32,355 - mmseg - INFO - Iter [118950/160000]	lr: 1.539e-05, eta: 3:15:34, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.4357, aux.loss_ce: 0.0616, aux.acc_seg: 93.3328, loss: 0.1653, grad_norm: 1.3442
2023-02-19 13:30:45,963 - mmseg - INFO - Saving checkpoint at 119000 iterations
2023-02-19 13:30:49,206 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:30:49,206 - mmseg - INFO - Iter [119000/160000]	lr: 1.538e-05, eta: 3:15:20, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1076, decode.acc_seg: 95.3477, aux.loss_ce: 0.0659, aux.acc_seg: 93.1888, loss: 0.1736, grad_norm: 1.4955
2023-02-19 13:31:03,376 - mmseg - INFO - Iter [119050/160000]	lr: 1.536e-05, eta: 3:15:06, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1052, decode.acc_seg: 95.3775, aux.loss_ce: 0.0623, aux.acc_seg: 93.3123, loss: 0.1675, grad_norm: 1.4492
2023-02-19 13:31:17,653 - mmseg - INFO - Iter [119100/160000]	lr: 1.534e-05, eta: 3:14:52, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1044, decode.acc_seg: 95.3146, aux.loss_ce: 0.0633, aux.acc_seg: 93.0975, loss: 0.1678, grad_norm: 1.4929
2023-02-19 13:31:31,417 - mmseg - INFO - Iter [119150/160000]	lr: 1.532e-05, eta: 3:14:37, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4131, aux.loss_ce: 0.0612, aux.acc_seg: 93.3567, loss: 0.1655, grad_norm: 1.2835
2023-02-19 13:31:45,436 - mmseg - INFO - Iter [119200/160000]	lr: 1.530e-05, eta: 3:14:23, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1021, decode.acc_seg: 95.4763, aux.loss_ce: 0.0603, aux.acc_seg: 93.5129, loss: 0.1625, grad_norm: 1.2882
2023-02-19 13:31:59,626 - mmseg - INFO - Iter [119250/160000]	lr: 1.528e-05, eta: 3:14:08, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1075, decode.acc_seg: 95.2923, aux.loss_ce: 0.0643, aux.acc_seg: 93.1123, loss: 0.1718, grad_norm: 1.2298
2023-02-19 13:32:13,506 - mmseg - INFO - Iter [119300/160000]	lr: 1.526e-05, eta: 3:13:54, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.4478, aux.loss_ce: 0.0602, aux.acc_seg: 93.4166, loss: 0.1620, grad_norm: 1.3117
2023-02-19 13:32:27,152 - mmseg - INFO - Iter [119350/160000]	lr: 1.524e-05, eta: 3:13:39, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1013, decode.acc_seg: 95.5463, aux.loss_ce: 0.0609, aux.acc_seg: 93.3353, loss: 0.1621, grad_norm: 1.4412
2023-02-19 13:32:41,238 - mmseg - INFO - Iter [119400/160000]	lr: 1.523e-05, eta: 3:13:25, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1042, decode.acc_seg: 95.4314, aux.loss_ce: 0.0606, aux.acc_seg: 93.5103, loss: 0.1647, grad_norm: 1.7423
2023-02-19 13:32:54,883 - mmseg - INFO - Iter [119450/160000]	lr: 1.521e-05, eta: 3:13:11, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2358, aux.loss_ce: 0.0626, aux.acc_seg: 93.3715, loss: 0.1714, grad_norm: 1.4685
2023-02-19 13:33:09,208 - mmseg - INFO - Iter [119500/160000]	lr: 1.519e-05, eta: 3:12:56, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1101, decode.acc_seg: 95.0883, aux.loss_ce: 0.0642, aux.acc_seg: 93.0377, loss: 0.1742, grad_norm: 1.7400
2023-02-19 13:33:23,122 - mmseg - INFO - Iter [119550/160000]	lr: 1.517e-05, eta: 3:12:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4469, aux.loss_ce: 0.0603, aux.acc_seg: 93.5663, loss: 0.1646, grad_norm: 2.7629
2023-02-19 13:33:37,184 - mmseg - INFO - Iter [119600/160000]	lr: 1.515e-05, eta: 3:12:28, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1009, decode.acc_seg: 95.5171, aux.loss_ce: 0.0616, aux.acc_seg: 93.3935, loss: 0.1625, grad_norm: 1.4259
2023-02-19 13:33:50,871 - mmseg - INFO - Iter [119650/160000]	lr: 1.513e-05, eta: 3:12:13, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1035, decode.acc_seg: 95.4222, aux.loss_ce: 0.0664, aux.acc_seg: 93.0999, loss: 0.1699, grad_norm: 1.6115
2023-02-19 13:34:04,928 - mmseg - INFO - Iter [119700/160000]	lr: 1.511e-05, eta: 3:11:59, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1033, decode.acc_seg: 95.4767, aux.loss_ce: 0.0613, aux.acc_seg: 93.3746, loss: 0.1646, grad_norm: 1.5122
2023-02-19 13:34:19,686 - mmseg - INFO - Iter [119750/160000]	lr: 1.509e-05, eta: 3:11:45, time: 0.295, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1052, decode.acc_seg: 95.3215, aux.loss_ce: 0.0597, aux.acc_seg: 93.4337, loss: 0.1649, grad_norm: 1.2878
2023-02-19 13:34:33,649 - mmseg - INFO - Iter [119800/160000]	lr: 1.508e-05, eta: 3:11:30, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.5274, aux.loss_ce: 0.0600, aux.acc_seg: 93.5790, loss: 0.1614, grad_norm: 1.4120
2023-02-19 13:34:47,384 - mmseg - INFO - Iter [119850/160000]	lr: 1.506e-05, eta: 3:11:16, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1066, decode.acc_seg: 95.3287, aux.loss_ce: 0.0634, aux.acc_seg: 93.2728, loss: 0.1700, grad_norm: 1.4158
2023-02-19 13:35:02,507 - mmseg - INFO - Iter [119900/160000]	lr: 1.504e-05, eta: 3:11:02, time: 0.302, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.4230, aux.loss_ce: 0.0644, aux.acc_seg: 93.1402, loss: 0.1692, grad_norm: 1.5073
2023-02-19 13:35:17,146 - mmseg - INFO - Iter [119950/160000]	lr: 1.502e-05, eta: 3:10:47, time: 0.293, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.3781, aux.loss_ce: 0.0622, aux.acc_seg: 93.3622, loss: 0.1678, grad_norm: 1.5025
2023-02-19 13:35:33,164 - mmseg - INFO - Saving checkpoint at 120000 iterations
2023-02-19 13:35:36,462 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:35:36,462 - mmseg - INFO - Iter [120000/160000]	lr: 1.500e-05, eta: 3:10:35, time: 0.386, data_time: 0.046, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.3186, aux.loss_ce: 0.0626, aux.acc_seg: 93.2663, loss: 0.1685, grad_norm: 1.6937
2023-02-19 13:35:50,276 - mmseg - INFO - Iter [120050/160000]	lr: 1.498e-05, eta: 3:10:20, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1060, decode.acc_seg: 95.2939, aux.loss_ce: 0.0628, aux.acc_seg: 93.2111, loss: 0.1688, grad_norm: 2.7474
2023-02-19 13:36:04,036 - mmseg - INFO - Iter [120100/160000]	lr: 1.496e-05, eta: 3:10:06, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1106, decode.acc_seg: 95.2525, aux.loss_ce: 0.0666, aux.acc_seg: 93.0867, loss: 0.1772, grad_norm: 1.7992
2023-02-19 13:36:17,985 - mmseg - INFO - Iter [120150/160000]	lr: 1.494e-05, eta: 3:09:51, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1051, decode.acc_seg: 95.2884, aux.loss_ce: 0.0644, aux.acc_seg: 93.0413, loss: 0.1695, grad_norm: 1.6486
2023-02-19 13:36:31,948 - mmseg - INFO - Iter [120200/160000]	lr: 1.493e-05, eta: 3:09:37, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1089, decode.acc_seg: 95.1776, aux.loss_ce: 0.0656, aux.acc_seg: 92.9926, loss: 0.1745, grad_norm: 1.6523
2023-02-19 13:36:45,557 - mmseg - INFO - Iter [120250/160000]	lr: 1.491e-05, eta: 3:09:23, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.6718, aux.loss_ce: 0.0605, aux.acc_seg: 93.6468, loss: 0.1611, grad_norm: 1.4433
2023-02-19 13:36:59,650 - mmseg - INFO - Iter [120300/160000]	lr: 1.489e-05, eta: 3:09:08, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0981, decode.acc_seg: 95.6275, aux.loss_ce: 0.0605, aux.acc_seg: 93.5074, loss: 0.1586, grad_norm: 1.6126
2023-02-19 13:37:13,181 - mmseg - INFO - Iter [120350/160000]	lr: 1.487e-05, eta: 3:08:54, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.2743, aux.loss_ce: 0.0644, aux.acc_seg: 93.0110, loss: 0.1718, grad_norm: 1.6124
2023-02-19 13:37:26,852 - mmseg - INFO - Iter [120400/160000]	lr: 1.485e-05, eta: 3:08:39, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1085, decode.acc_seg: 95.2030, aux.loss_ce: 0.0646, aux.acc_seg: 93.0419, loss: 0.1732, grad_norm: 1.4710
2023-02-19 13:37:41,114 - mmseg - INFO - Iter [120450/160000]	lr: 1.483e-05, eta: 3:08:25, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1061, decode.acc_seg: 95.2849, aux.loss_ce: 0.0635, aux.acc_seg: 93.0766, loss: 0.1696, grad_norm: 1.6388
2023-02-19 13:37:54,930 - mmseg - INFO - Iter [120500/160000]	lr: 1.481e-05, eta: 3:08:10, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4473, aux.loss_ce: 0.0646, aux.acc_seg: 93.1758, loss: 0.1689, grad_norm: 1.5172
2023-02-19 13:38:08,739 - mmseg - INFO - Iter [120550/160000]	lr: 1.479e-05, eta: 3:07:56, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.4959, aux.loss_ce: 0.0635, aux.acc_seg: 93.3209, loss: 0.1672, grad_norm: 1.3225
2023-02-19 13:38:23,296 - mmseg - INFO - Iter [120600/160000]	lr: 1.478e-05, eta: 3:07:42, time: 0.291, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0994, decode.acc_seg: 95.5377, aux.loss_ce: 0.0598, aux.acc_seg: 93.4458, loss: 0.1591, grad_norm: 1.3638
2023-02-19 13:38:36,935 - mmseg - INFO - Iter [120650/160000]	lr: 1.476e-05, eta: 3:07:27, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1038, decode.acc_seg: 95.3540, aux.loss_ce: 0.0620, aux.acc_seg: 93.1989, loss: 0.1658, grad_norm: 1.8868
2023-02-19 13:38:51,195 - mmseg - INFO - Iter [120700/160000]	lr: 1.474e-05, eta: 3:07:13, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.5938, aux.loss_ce: 0.0604, aux.acc_seg: 93.4546, loss: 0.1599, grad_norm: 1.7747
2023-02-19 13:39:05,594 - mmseg - INFO - Iter [120750/160000]	lr: 1.472e-05, eta: 3:06:59, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1054, decode.acc_seg: 95.4134, aux.loss_ce: 0.0633, aux.acc_seg: 93.2031, loss: 0.1687, grad_norm: 1.4598
2023-02-19 13:39:19,659 - mmseg - INFO - Iter [120800/160000]	lr: 1.470e-05, eta: 3:06:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.6343, aux.loss_ce: 0.0605, aux.acc_seg: 93.5721, loss: 0.1594, grad_norm: 1.7246
2023-02-19 13:39:33,278 - mmseg - INFO - Iter [120850/160000]	lr: 1.468e-05, eta: 3:06:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.4856, aux.loss_ce: 0.0614, aux.acc_seg: 93.4392, loss: 0.1651, grad_norm: 1.3529
2023-02-19 13:39:47,614 - mmseg - INFO - Iter [120900/160000]	lr: 1.466e-05, eta: 3:06:16, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1054, decode.acc_seg: 95.2937, aux.loss_ce: 0.0620, aux.acc_seg: 93.2875, loss: 0.1673, grad_norm: 1.4168
2023-02-19 13:40:01,786 - mmseg - INFO - Iter [120950/160000]	lr: 1.464e-05, eta: 3:06:01, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1046, decode.acc_seg: 95.3625, aux.loss_ce: 0.0618, aux.acc_seg: 93.3707, loss: 0.1664, grad_norm: 1.3764
2023-02-19 13:40:15,682 - mmseg - INFO - Saving checkpoint at 121000 iterations
2023-02-19 13:40:18,984 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:40:18,984 - mmseg - INFO - Iter [121000/160000]	lr: 1.463e-05, eta: 3:05:48, time: 0.344, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1092, decode.acc_seg: 95.2879, aux.loss_ce: 0.0651, aux.acc_seg: 93.1015, loss: 0.1743, grad_norm: 2.0315
2023-02-19 13:40:32,676 - mmseg - INFO - Iter [121050/160000]	lr: 1.461e-05, eta: 3:05:33, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.5380, aux.loss_ce: 0.0585, aux.acc_seg: 93.6098, loss: 0.1592, grad_norm: 1.1488
2023-02-19 13:40:46,623 - mmseg - INFO - Iter [121100/160000]	lr: 1.459e-05, eta: 3:05:19, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1020, decode.acc_seg: 95.5217, aux.loss_ce: 0.0616, aux.acc_seg: 93.4810, loss: 0.1636, grad_norm: 1.3687
2023-02-19 13:41:00,293 - mmseg - INFO - Iter [121150/160000]	lr: 1.457e-05, eta: 3:05:04, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.5880, aux.loss_ce: 0.0561, aux.acc_seg: 93.7208, loss: 0.1525, grad_norm: 1.3587
2023-02-19 13:41:13,905 - mmseg - INFO - Iter [121200/160000]	lr: 1.455e-05, eta: 3:04:50, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1088, decode.acc_seg: 95.2431, aux.loss_ce: 0.0664, aux.acc_seg: 92.9572, loss: 0.1752, grad_norm: 1.7987
2023-02-19 13:41:29,737 - mmseg - INFO - Iter [121250/160000]	lr: 1.453e-05, eta: 3:04:36, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1068, decode.acc_seg: 95.4140, aux.loss_ce: 0.0613, aux.acc_seg: 93.5043, loss: 0.1680, grad_norm: 1.5707
2023-02-19 13:41:43,439 - mmseg - INFO - Iter [121300/160000]	lr: 1.451e-05, eta: 3:04:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.6018, aux.loss_ce: 0.0586, aux.acc_seg: 93.6954, loss: 0.1577, grad_norm: 1.2689
2023-02-19 13:41:57,499 - mmseg - INFO - Iter [121350/160000]	lr: 1.449e-05, eta: 3:04:07, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1035, decode.acc_seg: 95.4972, aux.loss_ce: 0.0617, aux.acc_seg: 93.4798, loss: 0.1652, grad_norm: 1.6604
2023-02-19 13:42:11,902 - mmseg - INFO - Iter [121400/160000]	lr: 1.448e-05, eta: 3:03:53, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1025, decode.acc_seg: 95.4133, aux.loss_ce: 0.0602, aux.acc_seg: 93.5123, loss: 0.1626, grad_norm: 1.5488
2023-02-19 13:42:25,864 - mmseg - INFO - Iter [121450/160000]	lr: 1.446e-05, eta: 3:03:39, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1121, decode.acc_seg: 94.9607, aux.loss_ce: 0.0659, aux.acc_seg: 92.7681, loss: 0.1780, grad_norm: 1.5872
2023-02-19 13:42:40,017 - mmseg - INFO - Iter [121500/160000]	lr: 1.444e-05, eta: 3:03:24, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1071, decode.acc_seg: 95.3425, aux.loss_ce: 0.0634, aux.acc_seg: 93.4066, loss: 0.1704, grad_norm: 1.6337
2023-02-19 13:42:53,614 - mmseg - INFO - Iter [121550/160000]	lr: 1.442e-05, eta: 3:03:10, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.3978, aux.loss_ce: 0.0615, aux.acc_seg: 93.2645, loss: 0.1671, grad_norm: 1.8844
2023-02-19 13:43:07,958 - mmseg - INFO - Iter [121600/160000]	lr: 1.440e-05, eta: 3:02:56, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1000, decode.acc_seg: 95.4886, aux.loss_ce: 0.0600, aux.acc_seg: 93.4818, loss: 0.1601, grad_norm: 1.6472
2023-02-19 13:43:22,248 - mmseg - INFO - Iter [121650/160000]	lr: 1.438e-05, eta: 3:02:41, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1041, decode.acc_seg: 95.4906, aux.loss_ce: 0.0613, aux.acc_seg: 93.4737, loss: 0.1655, grad_norm: 1.5865
2023-02-19 13:43:36,152 - mmseg - INFO - Iter [121700/160000]	lr: 1.436e-05, eta: 3:02:27, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.7098, aux.loss_ce: 0.0588, aux.acc_seg: 93.6004, loss: 0.1561, grad_norm: 1.2656
2023-02-19 13:43:50,008 - mmseg - INFO - Iter [121750/160000]	lr: 1.434e-05, eta: 3:02:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1092, decode.acc_seg: 95.2563, aux.loss_ce: 0.0643, aux.acc_seg: 93.1650, loss: 0.1735, grad_norm: 1.4245
2023-02-19 13:44:04,127 - mmseg - INFO - Iter [121800/160000]	lr: 1.433e-05, eta: 3:01:58, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1040, decode.acc_seg: 95.4439, aux.loss_ce: 0.0607, aux.acc_seg: 93.4244, loss: 0.1646, grad_norm: 2.0334
2023-02-19 13:44:18,237 - mmseg - INFO - Iter [121850/160000]	lr: 1.431e-05, eta: 3:01:44, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1015, decode.acc_seg: 95.4618, aux.loss_ce: 0.0600, aux.acc_seg: 93.6057, loss: 0.1615, grad_norm: 1.3944
2023-02-19 13:44:33,326 - mmseg - INFO - Iter [121900/160000]	lr: 1.429e-05, eta: 3:01:30, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1001, decode.acc_seg: 95.5915, aux.loss_ce: 0.0599, aux.acc_seg: 93.5042, loss: 0.1599, grad_norm: 1.1801
2023-02-19 13:44:47,299 - mmseg - INFO - Iter [121950/160000]	lr: 1.427e-05, eta: 3:01:15, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1060, decode.acc_seg: 95.3110, aux.loss_ce: 0.0625, aux.acc_seg: 93.3098, loss: 0.1685, grad_norm: 1.2358
2023-02-19 13:45:01,214 - mmseg - INFO - Saving checkpoint at 122000 iterations
2023-02-19 13:45:04,520 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:45:04,520 - mmseg - INFO - Iter [122000/160000]	lr: 1.425e-05, eta: 3:01:02, time: 0.345, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1084, decode.acc_seg: 95.3575, aux.loss_ce: 0.0667, aux.acc_seg: 93.0807, loss: 0.1751, grad_norm: 1.5569
2023-02-19 13:45:18,190 - mmseg - INFO - Iter [122050/160000]	lr: 1.423e-05, eta: 3:00:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1059, decode.acc_seg: 95.3766, aux.loss_ce: 0.0617, aux.acc_seg: 93.3887, loss: 0.1676, grad_norm: 1.5750
2023-02-19 13:45:31,789 - mmseg - INFO - Iter [122100/160000]	lr: 1.421e-05, eta: 3:00:33, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1051, decode.acc_seg: 95.4205, aux.loss_ce: 0.0642, aux.acc_seg: 93.1110, loss: 0.1693, grad_norm: 1.4847
2023-02-19 13:45:45,557 - mmseg - INFO - Iter [122150/160000]	lr: 1.419e-05, eta: 3:00:18, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1017, decode.acc_seg: 95.6208, aux.loss_ce: 0.0605, aux.acc_seg: 93.6566, loss: 0.1622, grad_norm: 1.5027
2023-02-19 13:45:59,689 - mmseg - INFO - Iter [122200/160000]	lr: 1.418e-05, eta: 3:00:04, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1042, decode.acc_seg: 95.3855, aux.loss_ce: 0.0618, aux.acc_seg: 93.2817, loss: 0.1660, grad_norm: 1.5115
2023-02-19 13:46:14,986 - mmseg - INFO - Iter [122250/160000]	lr: 1.416e-05, eta: 2:59:50, time: 0.306, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.6261, aux.loss_ce: 0.0592, aux.acc_seg: 93.6474, loss: 0.1584, grad_norm: 1.2530
2023-02-19 13:46:28,577 - mmseg - INFO - Iter [122300/160000]	lr: 1.414e-05, eta: 2:59:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1034, decode.acc_seg: 95.4920, aux.loss_ce: 0.0622, aux.acc_seg: 93.4491, loss: 0.1656, grad_norm: 1.7186
2023-02-19 13:46:42,307 - mmseg - INFO - Iter [122350/160000]	lr: 1.412e-05, eta: 2:59:21, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1049, decode.acc_seg: 95.3528, aux.loss_ce: 0.0614, aux.acc_seg: 93.3101, loss: 0.1662, grad_norm: 1.3350
2023-02-19 13:46:56,171 - mmseg - INFO - Iter [122400/160000]	lr: 1.410e-05, eta: 2:59:07, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5147, aux.loss_ce: 0.0593, aux.acc_seg: 93.3474, loss: 0.1584, grad_norm: 1.3671
2023-02-19 13:47:10,480 - mmseg - INFO - Iter [122450/160000]	lr: 1.408e-05, eta: 2:58:52, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1091, decode.acc_seg: 95.1965, aux.loss_ce: 0.0610, aux.acc_seg: 93.3853, loss: 0.1701, grad_norm: 1.3213
2023-02-19 13:47:24,494 - mmseg - INFO - Iter [122500/160000]	lr: 1.406e-05, eta: 2:58:38, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1033, decode.acc_seg: 95.4315, aux.loss_ce: 0.0620, aux.acc_seg: 93.4012, loss: 0.1653, grad_norm: 1.3207
2023-02-19 13:47:40,278 - mmseg - INFO - Iter [122550/160000]	lr: 1.404e-05, eta: 2:58:24, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4012, aux.loss_ce: 0.0619, aux.acc_seg: 93.4066, loss: 0.1662, grad_norm: 1.4154
2023-02-19 13:47:53,955 - mmseg - INFO - Iter [122600/160000]	lr: 1.403e-05, eta: 2:58:10, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.3440, aux.loss_ce: 0.0628, aux.acc_seg: 93.2477, loss: 0.1692, grad_norm: 1.6430
2023-02-19 13:48:08,188 - mmseg - INFO - Iter [122650/160000]	lr: 1.401e-05, eta: 2:57:55, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1025, decode.acc_seg: 95.5567, aux.loss_ce: 0.0613, aux.acc_seg: 93.4804, loss: 0.1638, grad_norm: 1.3289
2023-02-19 13:48:21,785 - mmseg - INFO - Iter [122700/160000]	lr: 1.399e-05, eta: 2:57:41, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.5828, aux.loss_ce: 0.0586, aux.acc_seg: 93.7397, loss: 0.1593, grad_norm: 1.4767
2023-02-19 13:48:35,915 - mmseg - INFO - Iter [122750/160000]	lr: 1.397e-05, eta: 2:57:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.4232, aux.loss_ce: 0.0599, aux.acc_seg: 93.4133, loss: 0.1612, grad_norm: 1.5002
2023-02-19 13:48:49,693 - mmseg - INFO - Iter [122800/160000]	lr: 1.395e-05, eta: 2:57:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1080, decode.acc_seg: 95.1893, aux.loss_ce: 0.0635, aux.acc_seg: 93.2036, loss: 0.1716, grad_norm: 1.4945
2023-02-19 13:49:03,444 - mmseg - INFO - Iter [122850/160000]	lr: 1.393e-05, eta: 2:56:58, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1138, decode.acc_seg: 95.2735, aux.loss_ce: 0.0629, aux.acc_seg: 93.3869, loss: 0.1767, grad_norm: 1.6838
2023-02-19 13:49:17,065 - mmseg - INFO - Iter [122900/160000]	lr: 1.391e-05, eta: 2:56:43, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1062, decode.acc_seg: 95.4147, aux.loss_ce: 0.0615, aux.acc_seg: 93.3687, loss: 0.1677, grad_norm: 1.3763
2023-02-19 13:49:30,762 - mmseg - INFO - Iter [122950/160000]	lr: 1.389e-05, eta: 2:56:29, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.3551, aux.loss_ce: 0.0645, aux.acc_seg: 93.1223, loss: 0.1701, grad_norm: 2.0411
2023-02-19 13:49:45,160 - mmseg - INFO - Saving checkpoint at 123000 iterations
2023-02-19 13:49:48,411 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:49:48,411 - mmseg - INFO - Iter [123000/160000]	lr: 1.388e-05, eta: 2:56:15, time: 0.353, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1102, decode.acc_seg: 95.1559, aux.loss_ce: 0.0647, aux.acc_seg: 93.0483, loss: 0.1749, grad_norm: 1.6052
2023-02-19 13:50:02,486 - mmseg - INFO - Iter [123050/160000]	lr: 1.386e-05, eta: 2:56:01, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0998, decode.acc_seg: 95.5645, aux.loss_ce: 0.0611, aux.acc_seg: 93.4510, loss: 0.1609, grad_norm: 1.3472
2023-02-19 13:50:16,608 - mmseg - INFO - Iter [123100/160000]	lr: 1.384e-05, eta: 2:55:47, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.5891, aux.loss_ce: 0.0581, aux.acc_seg: 93.6115, loss: 0.1565, grad_norm: 1.7338
2023-02-19 13:50:30,396 - mmseg - INFO - Iter [123150/160000]	lr: 1.382e-05, eta: 2:55:32, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1015, decode.acc_seg: 95.4427, aux.loss_ce: 0.0623, aux.acc_seg: 93.1865, loss: 0.1638, grad_norm: 1.4552
2023-02-19 13:50:45,393 - mmseg - INFO - Iter [123200/160000]	lr: 1.380e-05, eta: 2:55:18, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.6867, aux.loss_ce: 0.0574, aux.acc_seg: 93.7055, loss: 0.1536, grad_norm: 1.4475
2023-02-19 13:50:59,416 - mmseg - INFO - Iter [123250/160000]	lr: 1.378e-05, eta: 2:55:04, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.4927, aux.loss_ce: 0.0613, aux.acc_seg: 93.2841, loss: 0.1625, grad_norm: 1.5963
2023-02-19 13:51:13,275 - mmseg - INFO - Iter [123300/160000]	lr: 1.376e-05, eta: 2:54:49, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1056, decode.acc_seg: 95.3368, aux.loss_ce: 0.0628, aux.acc_seg: 93.3149, loss: 0.1684, grad_norm: 1.3656
2023-02-19 13:51:27,131 - mmseg - INFO - Iter [123350/160000]	lr: 1.374e-05, eta: 2:54:35, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1026, decode.acc_seg: 95.3409, aux.loss_ce: 0.0625, aux.acc_seg: 93.0502, loss: 0.1651, grad_norm: 1.3848
2023-02-19 13:51:41,306 - mmseg - INFO - Iter [123400/160000]	lr: 1.373e-05, eta: 2:54:21, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0943, decode.acc_seg: 95.7158, aux.loss_ce: 0.0560, aux.acc_seg: 93.8865, loss: 0.1503, grad_norm: 1.2166
2023-02-19 13:51:55,082 - mmseg - INFO - Iter [123450/160000]	lr: 1.371e-05, eta: 2:54:06, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1004, decode.acc_seg: 95.5936, aux.loss_ce: 0.0590, aux.acc_seg: 93.5868, loss: 0.1594, grad_norm: 1.4123
2023-02-19 13:52:08,964 - mmseg - INFO - Iter [123500/160000]	lr: 1.369e-05, eta: 2:53:52, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1049, decode.acc_seg: 95.5162, aux.loss_ce: 0.0624, aux.acc_seg: 93.4030, loss: 0.1673, grad_norm: 1.5188
2023-02-19 13:52:23,167 - mmseg - INFO - Iter [123550/160000]	lr: 1.367e-05, eta: 2:53:38, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1072, decode.acc_seg: 95.3725, aux.loss_ce: 0.0634, aux.acc_seg: 93.1917, loss: 0.1707, grad_norm: 1.8283
2023-02-19 13:52:36,799 - mmseg - INFO - Iter [123600/160000]	lr: 1.365e-05, eta: 2:53:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.5130, aux.loss_ce: 0.0593, aux.acc_seg: 93.5372, loss: 0.1601, grad_norm: 1.2435
2023-02-19 13:52:50,647 - mmseg - INFO - Iter [123650/160000]	lr: 1.363e-05, eta: 2:53:09, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1011, decode.acc_seg: 95.2766, aux.loss_ce: 0.0575, aux.acc_seg: 93.5194, loss: 0.1586, grad_norm: 0.9982
2023-02-19 13:53:04,529 - mmseg - INFO - Iter [123700/160000]	lr: 1.361e-05, eta: 2:52:54, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1015, decode.acc_seg: 95.5904, aux.loss_ce: 0.0604, aux.acc_seg: 93.5283, loss: 0.1619, grad_norm: 1.5512
2023-02-19 13:53:18,404 - mmseg - INFO - Iter [123750/160000]	lr: 1.359e-05, eta: 2:52:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1017, decode.acc_seg: 95.5888, aux.loss_ce: 0.0603, aux.acc_seg: 93.5939, loss: 0.1620, grad_norm: 1.4899
2023-02-19 13:53:34,928 - mmseg - INFO - Iter [123800/160000]	lr: 1.358e-05, eta: 2:52:26, time: 0.330, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.2278, aux.loss_ce: 0.0631, aux.acc_seg: 93.1478, loss: 0.1696, grad_norm: 2.1261
2023-02-19 13:53:49,574 - mmseg - INFO - Iter [123850/160000]	lr: 1.356e-05, eta: 2:52:12, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1106, decode.acc_seg: 95.1690, aux.loss_ce: 0.0649, aux.acc_seg: 93.2075, loss: 0.1755, grad_norm: 1.8953
2023-02-19 13:54:03,534 - mmseg - INFO - Iter [123900/160000]	lr: 1.354e-05, eta: 2:51:58, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0950, decode.acc_seg: 95.7509, aux.loss_ce: 0.0572, aux.acc_seg: 93.7120, loss: 0.1522, grad_norm: 1.1471
2023-02-19 13:54:17,376 - mmseg - INFO - Iter [123950/160000]	lr: 1.352e-05, eta: 2:51:43, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.5074, aux.loss_ce: 0.0590, aux.acc_seg: 93.4625, loss: 0.1563, grad_norm: 1.6411
2023-02-19 13:54:31,442 - mmseg - INFO - Saving checkpoint at 124000 iterations
2023-02-19 13:54:34,720 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:54:34,720 - mmseg - INFO - Iter [124000/160000]	lr: 1.350e-05, eta: 2:51:30, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1035, decode.acc_seg: 95.4384, aux.loss_ce: 0.0624, aux.acc_seg: 93.3138, loss: 0.1659, grad_norm: 1.2037
2023-02-19 13:54:48,338 - mmseg - INFO - Iter [124050/160000]	lr: 1.348e-05, eta: 2:51:15, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1031, decode.acc_seg: 95.4584, aux.loss_ce: 0.0599, aux.acc_seg: 93.5224, loss: 0.1630, grad_norm: 1.3635
2023-02-19 13:55:02,079 - mmseg - INFO - Iter [124100/160000]	lr: 1.346e-05, eta: 2:51:01, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1022, decode.acc_seg: 95.5333, aux.loss_ce: 0.0602, aux.acc_seg: 93.4552, loss: 0.1624, grad_norm: 1.8106
2023-02-19 13:55:15,753 - mmseg - INFO - Iter [124150/160000]	lr: 1.344e-05, eta: 2:50:46, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.4170, aux.loss_ce: 0.0623, aux.acc_seg: 93.2855, loss: 0.1666, grad_norm: 1.6056
2023-02-19 13:55:29,650 - mmseg - INFO - Iter [124200/160000]	lr: 1.343e-05, eta: 2:50:32, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1028, decode.acc_seg: 95.4793, aux.loss_ce: 0.0605, aux.acc_seg: 93.4670, loss: 0.1633, grad_norm: 1.3973
2023-02-19 13:55:43,595 - mmseg - INFO - Iter [124250/160000]	lr: 1.341e-05, eta: 2:50:18, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1050, decode.acc_seg: 95.3745, aux.loss_ce: 0.0611, aux.acc_seg: 93.4307, loss: 0.1662, grad_norm: 1.4399
2023-02-19 13:55:57,669 - mmseg - INFO - Iter [124300/160000]	lr: 1.339e-05, eta: 2:50:03, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1075, decode.acc_seg: 95.2709, aux.loss_ce: 0.0612, aux.acc_seg: 93.4432, loss: 0.1687, grad_norm: 1.7479
2023-02-19 13:56:11,781 - mmseg - INFO - Iter [124350/160000]	lr: 1.337e-05, eta: 2:49:49, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.5376, aux.loss_ce: 0.0604, aux.acc_seg: 93.4491, loss: 0.1612, grad_norm: 1.4749
2023-02-19 13:56:25,655 - mmseg - INFO - Iter [124400/160000]	lr: 1.335e-05, eta: 2:49:34, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.6446, aux.loss_ce: 0.0599, aux.acc_seg: 93.5836, loss: 0.1586, grad_norm: 1.4877
2023-02-19 13:56:39,635 - mmseg - INFO - Iter [124450/160000]	lr: 1.333e-05, eta: 2:49:20, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1038, decode.acc_seg: 95.3736, aux.loss_ce: 0.0617, aux.acc_seg: 93.3204, loss: 0.1655, grad_norm: 1.5490
2023-02-19 13:56:53,719 - mmseg - INFO - Iter [124500/160000]	lr: 1.331e-05, eta: 2:49:06, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.6509, aux.loss_ce: 0.0596, aux.acc_seg: 93.5230, loss: 0.1581, grad_norm: 1.5235
2023-02-19 13:57:07,320 - mmseg - INFO - Iter [124550/160000]	lr: 1.329e-05, eta: 2:48:51, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1023, decode.acc_seg: 95.4780, aux.loss_ce: 0.0606, aux.acc_seg: 93.4166, loss: 0.1629, grad_norm: 1.4358
2023-02-19 13:57:20,973 - mmseg - INFO - Iter [124600/160000]	lr: 1.328e-05, eta: 2:48:37, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.5281, aux.loss_ce: 0.0605, aux.acc_seg: 93.4990, loss: 0.1613, grad_norm: 1.7316
2023-02-19 13:57:34,715 - mmseg - INFO - Iter [124650/160000]	lr: 1.326e-05, eta: 2:48:22, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1021, decode.acc_seg: 95.3974, aux.loss_ce: 0.0591, aux.acc_seg: 93.5752, loss: 0.1612, grad_norm: 2.2380
2023-02-19 13:57:48,509 - mmseg - INFO - Iter [124700/160000]	lr: 1.324e-05, eta: 2:48:08, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.5212, aux.loss_ce: 0.0594, aux.acc_seg: 93.4681, loss: 0.1588, grad_norm: 1.3187
2023-02-19 13:58:02,457 - mmseg - INFO - Iter [124750/160000]	lr: 1.322e-05, eta: 2:47:54, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1013, decode.acc_seg: 95.5861, aux.loss_ce: 0.0599, aux.acc_seg: 93.5483, loss: 0.1612, grad_norm: 1.5704
2023-02-19 13:58:16,453 - mmseg - INFO - Iter [124800/160000]	lr: 1.320e-05, eta: 2:47:39, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1080, decode.acc_seg: 95.2060, aux.loss_ce: 0.0635, aux.acc_seg: 93.1373, loss: 0.1714, grad_norm: 1.6024
2023-02-19 13:58:30,366 - mmseg - INFO - Iter [124850/160000]	lr: 1.318e-05, eta: 2:47:25, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0976, decode.acc_seg: 95.6504, aux.loss_ce: 0.0595, aux.acc_seg: 93.6050, loss: 0.1571, grad_norm: 1.2401
2023-02-19 13:58:44,125 - mmseg - INFO - Iter [124900/160000]	lr: 1.316e-05, eta: 2:47:10, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1004, decode.acc_seg: 95.5321, aux.loss_ce: 0.0593, aux.acc_seg: 93.6657, loss: 0.1597, grad_norm: 1.3492
2023-02-19 13:58:58,510 - mmseg - INFO - Iter [124950/160000]	lr: 1.314e-05, eta: 2:46:56, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1027, decode.acc_seg: 95.4444, aux.loss_ce: 0.0628, aux.acc_seg: 93.2317, loss: 0.1654, grad_norm: 1.1055
2023-02-19 13:59:12,487 - mmseg - INFO - Saving checkpoint at 125000 iterations
2023-02-19 13:59:15,714 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 13:59:15,714 - mmseg - INFO - Iter [125000/160000]	lr: 1.313e-05, eta: 2:46:43, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.5217, aux.loss_ce: 0.0599, aux.acc_seg: 93.6244, loss: 0.1618, grad_norm: 1.8143
2023-02-19 13:59:31,898 - mmseg - INFO - Iter [125050/160000]	lr: 1.311e-05, eta: 2:46:29, time: 0.324, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1042, decode.acc_seg: 95.3432, aux.loss_ce: 0.0646, aux.acc_seg: 93.0252, loss: 0.1688, grad_norm: 1.4409
2023-02-19 13:59:45,837 - mmseg - INFO - Iter [125100/160000]	lr: 1.309e-05, eta: 2:46:14, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.4691, aux.loss_ce: 0.0604, aux.acc_seg: 93.5757, loss: 0.1634, grad_norm: 1.6580
2023-02-19 13:59:59,602 - mmseg - INFO - Iter [125150/160000]	lr: 1.307e-05, eta: 2:46:00, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1069, decode.acc_seg: 95.4029, aux.loss_ce: 0.0637, aux.acc_seg: 93.3203, loss: 0.1706, grad_norm: 2.1335
2023-02-19 14:00:13,126 - mmseg - INFO - Iter [125200/160000]	lr: 1.305e-05, eta: 2:45:45, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1054, decode.acc_seg: 95.4394, aux.loss_ce: 0.0607, aux.acc_seg: 93.5413, loss: 0.1661, grad_norm: 1.3408
2023-02-19 14:00:27,310 - mmseg - INFO - Iter [125250/160000]	lr: 1.303e-05, eta: 2:45:31, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.2810, aux.loss_ce: 0.0646, aux.acc_seg: 93.0796, loss: 0.1721, grad_norm: 1.5697
2023-02-19 14:00:41,492 - mmseg - INFO - Iter [125300/160000]	lr: 1.301e-05, eta: 2:45:17, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1047, decode.acc_seg: 95.5000, aux.loss_ce: 0.0615, aux.acc_seg: 93.4451, loss: 0.1661, grad_norm: 1.5451
2023-02-19 14:00:55,675 - mmseg - INFO - Iter [125350/160000]	lr: 1.299e-05, eta: 2:45:03, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0977, decode.acc_seg: 95.6491, aux.loss_ce: 0.0589, aux.acc_seg: 93.5536, loss: 0.1566, grad_norm: 1.2469
2023-02-19 14:01:09,774 - mmseg - INFO - Iter [125400/160000]	lr: 1.298e-05, eta: 2:44:48, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1023, decode.acc_seg: 95.5406, aux.loss_ce: 0.0622, aux.acc_seg: 93.3294, loss: 0.1645, grad_norm: 1.3195
2023-02-19 14:01:24,268 - mmseg - INFO - Iter [125450/160000]	lr: 1.296e-05, eta: 2:44:34, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0971, decode.acc_seg: 95.6084, aux.loss_ce: 0.0589, aux.acc_seg: 93.4911, loss: 0.1561, grad_norm: 1.3563
2023-02-19 14:01:38,722 - mmseg - INFO - Iter [125500/160000]	lr: 1.294e-05, eta: 2:44:20, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1036, decode.acc_seg: 95.4123, aux.loss_ce: 0.0603, aux.acc_seg: 93.5160, loss: 0.1639, grad_norm: 1.4698
2023-02-19 14:01:53,018 - mmseg - INFO - Iter [125550/160000]	lr: 1.292e-05, eta: 2:44:05, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1001, decode.acc_seg: 95.6018, aux.loss_ce: 0.0576, aux.acc_seg: 93.8035, loss: 0.1577, grad_norm: 1.4390
2023-02-19 14:02:07,522 - mmseg - INFO - Iter [125600/160000]	lr: 1.290e-05, eta: 2:43:51, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1010, decode.acc_seg: 95.4796, aux.loss_ce: 0.0606, aux.acc_seg: 93.4673, loss: 0.1616, grad_norm: 1.6918
2023-02-19 14:02:21,683 - mmseg - INFO - Iter [125650/160000]	lr: 1.288e-05, eta: 2:43:37, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.3804, aux.loss_ce: 0.0645, aux.acc_seg: 93.2991, loss: 0.1719, grad_norm: 1.3849
2023-02-19 14:02:35,511 - mmseg - INFO - Iter [125700/160000]	lr: 1.286e-05, eta: 2:43:22, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1009, decode.acc_seg: 95.4122, aux.loss_ce: 0.0596, aux.acc_seg: 93.4473, loss: 0.1605, grad_norm: 1.3927
2023-02-19 14:02:49,360 - mmseg - INFO - Iter [125750/160000]	lr: 1.284e-05, eta: 2:43:08, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1069, decode.acc_seg: 95.3565, aux.loss_ce: 0.0644, aux.acc_seg: 93.2017, loss: 0.1713, grad_norm: 1.7185
2023-02-19 14:03:03,417 - mmseg - INFO - Iter [125800/160000]	lr: 1.283e-05, eta: 2:42:54, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1023, decode.acc_seg: 95.4444, aux.loss_ce: 0.0594, aux.acc_seg: 93.5300, loss: 0.1617, grad_norm: 1.5404
2023-02-19 14:03:17,053 - mmseg - INFO - Iter [125850/160000]	lr: 1.281e-05, eta: 2:42:39, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0968, decode.acc_seg: 95.7379, aux.loss_ce: 0.0588, aux.acc_seg: 93.6360, loss: 0.1555, grad_norm: 1.1335
2023-02-19 14:03:31,931 - mmseg - INFO - Iter [125900/160000]	lr: 1.279e-05, eta: 2:42:25, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.5909, aux.loss_ce: 0.0602, aux.acc_seg: 93.5185, loss: 0.1589, grad_norm: 1.4049
2023-02-19 14:03:46,123 - mmseg - INFO - Iter [125950/160000]	lr: 1.277e-05, eta: 2:42:11, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1036, decode.acc_seg: 95.4977, aux.loss_ce: 0.0596, aux.acc_seg: 93.7023, loss: 0.1632, grad_norm: 1.1718
2023-02-19 14:03:59,807 - mmseg - INFO - Saving checkpoint at 126000 iterations
2023-02-19 14:04:03,121 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:04:03,121 - mmseg - INFO - Iter [126000/160000]	lr: 1.275e-05, eta: 2:41:57, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.6041, aux.loss_ce: 0.0580, aux.acc_seg: 93.8295, loss: 0.1572, grad_norm: 1.3006
2023-02-19 14:04:16,777 - mmseg - INFO - Iter [126050/160000]	lr: 1.273e-05, eta: 2:41:43, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.6042, aux.loss_ce: 0.0587, aux.acc_seg: 93.6811, loss: 0.1583, grad_norm: 1.3599
2023-02-19 14:04:30,403 - mmseg - INFO - Iter [126100/160000]	lr: 1.271e-05, eta: 2:41:28, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1048, decode.acc_seg: 95.3113, aux.loss_ce: 0.0613, aux.acc_seg: 93.3319, loss: 0.1661, grad_norm: 1.5584
2023-02-19 14:04:44,178 - mmseg - INFO - Iter [126150/160000]	lr: 1.269e-05, eta: 2:41:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.4297, aux.loss_ce: 0.0590, aux.acc_seg: 93.5331, loss: 0.1604, grad_norm: 1.1800
2023-02-19 14:04:58,128 - mmseg - INFO - Iter [126200/160000]	lr: 1.268e-05, eta: 2:40:59, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1005, decode.acc_seg: 95.5231, aux.loss_ce: 0.0617, aux.acc_seg: 93.3103, loss: 0.1622, grad_norm: 1.2736
2023-02-19 14:05:11,820 - mmseg - INFO - Iter [126250/160000]	lr: 1.266e-05, eta: 2:40:45, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1029, decode.acc_seg: 95.3440, aux.loss_ce: 0.0631, aux.acc_seg: 93.1132, loss: 0.1660, grad_norm: 1.5389
2023-02-19 14:05:25,407 - mmseg - INFO - Iter [126300/160000]	lr: 1.264e-05, eta: 2:40:31, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.5228, aux.loss_ce: 0.0616, aux.acc_seg: 93.3812, loss: 0.1634, grad_norm: 1.3204
2023-02-19 14:05:41,230 - mmseg - INFO - Iter [126350/160000]	lr: 1.262e-05, eta: 2:40:17, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1026, decode.acc_seg: 95.4968, aux.loss_ce: 0.0602, aux.acc_seg: 93.4425, loss: 0.1628, grad_norm: 1.1457
2023-02-19 14:05:54,887 - mmseg - INFO - Iter [126400/160000]	lr: 1.260e-05, eta: 2:40:02, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.7721, aux.loss_ce: 0.0581, aux.acc_seg: 93.7711, loss: 0.1546, grad_norm: 1.1953
2023-02-19 14:06:08,735 - mmseg - INFO - Iter [126450/160000]	lr: 1.258e-05, eta: 2:39:48, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1002, decode.acc_seg: 95.5694, aux.loss_ce: 0.0602, aux.acc_seg: 93.5582, loss: 0.1604, grad_norm: 1.7075
2023-02-19 14:06:22,465 - mmseg - INFO - Iter [126500/160000]	lr: 1.256e-05, eta: 2:39:33, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1045, decode.acc_seg: 95.4456, aux.loss_ce: 0.0607, aux.acc_seg: 93.5598, loss: 0.1652, grad_norm: 1.3747
2023-02-19 14:06:36,592 - mmseg - INFO - Iter [126550/160000]	lr: 1.254e-05, eta: 2:39:19, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.6655, aux.loss_ce: 0.0611, aux.acc_seg: 93.5642, loss: 0.1623, grad_norm: 1.5142
2023-02-19 14:06:50,956 - mmseg - INFO - Iter [126600/160000]	lr: 1.253e-05, eta: 2:39:05, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.5728, aux.loss_ce: 0.0592, aux.acc_seg: 93.6494, loss: 0.1586, grad_norm: 1.2633
2023-02-19 14:07:04,607 - mmseg - INFO - Iter [126650/160000]	lr: 1.251e-05, eta: 2:38:50, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.2931, aux.loss_ce: 0.0610, aux.acc_seg: 93.3012, loss: 0.1653, grad_norm: 1.2705
2023-02-19 14:07:18,258 - mmseg - INFO - Iter [126700/160000]	lr: 1.249e-05, eta: 2:38:36, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1065, decode.acc_seg: 95.4567, aux.loss_ce: 0.0643, aux.acc_seg: 93.3797, loss: 0.1707, grad_norm: 1.2968
2023-02-19 14:07:31,920 - mmseg - INFO - Iter [126750/160000]	lr: 1.247e-05, eta: 2:38:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.5585, aux.loss_ce: 0.0609, aux.acc_seg: 93.5005, loss: 0.1605, grad_norm: 1.6174
2023-02-19 14:07:45,999 - mmseg - INFO - Iter [126800/160000]	lr: 1.245e-05, eta: 2:38:07, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1032, decode.acc_seg: 95.4398, aux.loss_ce: 0.0611, aux.acc_seg: 93.3914, loss: 0.1643, grad_norm: 1.1554
2023-02-19 14:08:00,326 - mmseg - INFO - Iter [126850/160000]	lr: 1.243e-05, eta: 2:37:53, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0998, decode.acc_seg: 95.5266, aux.loss_ce: 0.0588, aux.acc_seg: 93.6042, loss: 0.1586, grad_norm: 1.4539
2023-02-19 14:08:14,561 - mmseg - INFO - Iter [126900/160000]	lr: 1.241e-05, eta: 2:37:38, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1028, decode.acc_seg: 95.4527, aux.loss_ce: 0.0606, aux.acc_seg: 93.4416, loss: 0.1634, grad_norm: 1.5900
2023-02-19 14:08:28,480 - mmseg - INFO - Iter [126950/160000]	lr: 1.239e-05, eta: 2:37:24, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0997, decode.acc_seg: 95.6037, aux.loss_ce: 0.0591, aux.acc_seg: 93.6470, loss: 0.1588, grad_norm: 1.2044
2023-02-19 14:08:42,246 - mmseg - INFO - Saving checkpoint at 127000 iterations
2023-02-19 14:08:45,504 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:08:45,504 - mmseg - INFO - Iter [127000/160000]	lr: 1.238e-05, eta: 2:37:10, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1058, decode.acc_seg: 95.4005, aux.loss_ce: 0.0612, aux.acc_seg: 93.4577, loss: 0.1670, grad_norm: 1.5659
2023-02-19 14:08:59,428 - mmseg - INFO - Iter [127050/160000]	lr: 1.236e-05, eta: 2:36:56, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.6659, aux.loss_ce: 0.0575, aux.acc_seg: 93.7352, loss: 0.1540, grad_norm: 1.2818
2023-02-19 14:09:13,096 - mmseg - INFO - Iter [127100/160000]	lr: 1.234e-05, eta: 2:36:42, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1029, decode.acc_seg: 95.3792, aux.loss_ce: 0.0620, aux.acc_seg: 93.2827, loss: 0.1649, grad_norm: 1.6265
2023-02-19 14:09:27,334 - mmseg - INFO - Iter [127150/160000]	lr: 1.232e-05, eta: 2:36:27, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.5066, aux.loss_ce: 0.0599, aux.acc_seg: 93.5211, loss: 0.1611, grad_norm: 1.2390
2023-02-19 14:09:41,198 - mmseg - INFO - Iter [127200/160000]	lr: 1.230e-05, eta: 2:36:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.5444, aux.loss_ce: 0.0597, aux.acc_seg: 93.5226, loss: 0.1603, grad_norm: 1.4225
2023-02-19 14:09:55,705 - mmseg - INFO - Iter [127250/160000]	lr: 1.228e-05, eta: 2:35:59, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.5593, aux.loss_ce: 0.0606, aux.acc_seg: 93.4747, loss: 0.1612, grad_norm: 1.2129
2023-02-19 14:10:09,444 - mmseg - INFO - Iter [127300/160000]	lr: 1.226e-05, eta: 2:35:44, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1016, decode.acc_seg: 95.4752, aux.loss_ce: 0.0583, aux.acc_seg: 93.6223, loss: 0.1599, grad_norm: 1.4192
2023-02-19 14:10:23,044 - mmseg - INFO - Iter [127350/160000]	lr: 1.224e-05, eta: 2:35:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1011, decode.acc_seg: 95.5075, aux.loss_ce: 0.0609, aux.acc_seg: 93.4198, loss: 0.1620, grad_norm: 1.3576
2023-02-19 14:10:37,762 - mmseg - INFO - Iter [127400/160000]	lr: 1.223e-05, eta: 2:35:16, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1005, decode.acc_seg: 95.6279, aux.loss_ce: 0.0589, aux.acc_seg: 93.7219, loss: 0.1594, grad_norm: 1.2613
2023-02-19 14:10:52,345 - mmseg - INFO - Iter [127450/160000]	lr: 1.221e-05, eta: 2:35:01, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.3454, aux.loss_ce: 0.0622, aux.acc_seg: 93.2080, loss: 0.1652, grad_norm: 1.4792
2023-02-19 14:11:06,178 - mmseg - INFO - Iter [127500/160000]	lr: 1.219e-05, eta: 2:34:47, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0939, decode.acc_seg: 95.8264, aux.loss_ce: 0.0556, aux.acc_seg: 94.0150, loss: 0.1495, grad_norm: 1.0964
2023-02-19 14:11:20,598 - mmseg - INFO - Iter [127550/160000]	lr: 1.217e-05, eta: 2:34:33, time: 0.289, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.4982, aux.loss_ce: 0.0586, aux.acc_seg: 93.5772, loss: 0.1570, grad_norm: 1.3803
2023-02-19 14:11:36,599 - mmseg - INFO - Iter [127600/160000]	lr: 1.215e-05, eta: 2:34:19, time: 0.320, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1052, decode.acc_seg: 95.2918, aux.loss_ce: 0.0612, aux.acc_seg: 93.3496, loss: 0.1665, grad_norm: 1.4107
2023-02-19 14:11:50,174 - mmseg - INFO - Iter [127650/160000]	lr: 1.213e-05, eta: 2:34:04, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1017, decode.acc_seg: 95.5345, aux.loss_ce: 0.0608, aux.acc_seg: 93.4319, loss: 0.1625, grad_norm: 1.4244
2023-02-19 14:12:03,877 - mmseg - INFO - Iter [127700/160000]	lr: 1.211e-05, eta: 2:33:50, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1084, decode.acc_seg: 95.2413, aux.loss_ce: 0.0650, aux.acc_seg: 93.0486, loss: 0.1734, grad_norm: 1.4193
2023-02-19 14:12:17,669 - mmseg - INFO - Iter [127750/160000]	lr: 1.209e-05, eta: 2:33:36, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1035, decode.acc_seg: 95.4268, aux.loss_ce: 0.0624, aux.acc_seg: 93.2207, loss: 0.1659, grad_norm: 1.4485
2023-02-19 14:12:32,059 - mmseg - INFO - Iter [127800/160000]	lr: 1.208e-05, eta: 2:33:21, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.5997, aux.loss_ce: 0.0592, aux.acc_seg: 93.6241, loss: 0.1580, grad_norm: 1.1985
2023-02-19 14:12:46,035 - mmseg - INFO - Iter [127850/160000]	lr: 1.206e-05, eta: 2:33:07, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.7232, aux.loss_ce: 0.0591, aux.acc_seg: 93.7388, loss: 0.1564, grad_norm: 1.3371
2023-02-19 14:12:59,926 - mmseg - INFO - Iter [127900/160000]	lr: 1.204e-05, eta: 2:32:53, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0916, decode.acc_seg: 95.8624, aux.loss_ce: 0.0557, aux.acc_seg: 93.8525, loss: 0.1473, grad_norm: 1.2268
2023-02-19 14:13:13,881 - mmseg - INFO - Iter [127950/160000]	lr: 1.202e-05, eta: 2:32:38, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.6376, aux.loss_ce: 0.0579, aux.acc_seg: 93.8210, loss: 0.1569, grad_norm: 1.1665
2023-02-19 14:13:27,521 - mmseg - INFO - Saving checkpoint at 128000 iterations
2023-02-19 14:13:30,808 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:13:30,808 - mmseg - INFO - Iter [128000/160000]	lr: 1.200e-05, eta: 2:32:25, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.5275, aux.loss_ce: 0.0586, aux.acc_seg: 93.6467, loss: 0.1592, grad_norm: 1.4577
2023-02-19 14:13:45,811 - mmseg - INFO - per class results:
2023-02-19 14:13:45,816 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 79.16 | 88.68 |
|       building      | 82.55 | 91.94 |
|         sky         |  94.5 | 97.72 |
|        floor        | 81.12 | 90.74 |
|         tree        |  75.6 |  88.9 |
|       ceiling       | 85.09 | 94.05 |
|         road        | 84.03 | 90.27 |
|         bed         | 90.98 | 96.07 |
|      windowpane     | 63.56 | 78.58 |
|        grass        | 67.73 | 82.53 |
|       cabinet       | 63.99 | 76.98 |
|       sidewalk      | 70.81 |  85.1 |
|        person       | 82.21 | 93.52 |
|        earth        | 36.67 | 49.55 |
|         door        | 54.37 | 69.86 |
|        table        | 65.99 | 78.83 |
|       mountain      | 59.04 | 71.94 |
|        plant        |  51.7 | 64.77 |
|       curtain       | 75.93 | 86.37 |
|        chair        | 64.97 | 81.32 |
|         car         | 85.42 | 92.59 |
|        water        | 53.72 | 66.12 |
|       painting      | 78.76 | 88.34 |
|         sofa        | 74.87 | 87.67 |
|        shelf        | 45.94 | 62.94 |
|        house        | 36.74 | 46.53 |
|         sea         | 63.76 | 84.52 |
|        mirror       | 72.06 |  82.1 |
|         rug         |  50.7 | 58.86 |
|        field        | 31.65 | 47.93 |
|       armchair      | 50.64 | 63.63 |
|         seat        | 62.03 | 86.48 |
|        fence        | 44.82 | 66.63 |
|         desk        | 53.45 | 76.21 |
|         rock        | 50.34 | 72.54 |
|       wardrobe      | 51.04 | 65.98 |
|         lamp        | 68.65 | 78.05 |
|       bathtub       |  82.6 |  86.8 |
|       railing       | 38.72 | 52.38 |
|       cushion       | 62.81 | 77.54 |
|         base        | 43.46 | 57.78 |
|         box         | 29.86 | 37.63 |
|        column       | 50.44 | 64.92 |
|      signboard      | 41.82 | 58.14 |
|   chest of drawers  | 40.15 | 59.59 |
|       counter       | 29.49 | 37.77 |
|         sand        | 55.99 | 82.31 |
|         sink        | 74.36 |  82.7 |
|      skyscraper     | 52.34 | 66.27 |
|      fireplace      | 77.03 | 91.63 |
|     refrigerator    | 85.76 | 92.13 |
|      grandstand     | 46.43 |  76.1 |
|         path        | 24.88 | 36.92 |
|        stairs       |  28.0 | 36.65 |
|        runway       | 68.19 | 89.53 |
|         case        | 51.54 |  72.5 |
|      pool table     | 93.59 | 96.95 |
|        pillow       | 60.82 | 69.67 |
|     screen door     | 77.45 | 80.11 |
|       stairway      | 30.96 | 42.56 |
|        river        |  9.95 | 25.11 |
|        bridge       |  71.3 | 83.66 |
|       bookcase      | 49.11 | 66.63 |
|        blind        |  47.5 | 53.02 |
|     coffee table    | 64.52 | 77.75 |
|        toilet       | 87.77 | 91.78 |
|        flower       | 42.68 |  59.0 |
|         book        |  44.7 | 63.66 |
|         hill        | 12.82 | 18.81 |
|        bench        | 48.42 | 54.34 |
|      countertop     | 55.61 | 79.73 |
|        stove        | 81.65 |  85.7 |
|         palm        | 55.57 | 82.91 |
|    kitchen island   | 48.07 | 79.06 |
|       computer      |  75.3 | 83.93 |
|     swivel chair    | 44.22 | 61.46 |
|         boat        | 53.76 | 59.74 |
|         bar         |  46.0 | 60.79 |
|    arcade machine   | 38.84 | 41.88 |
|        hovel        | 47.33 | 52.65 |
|         bus         | 89.66 | 96.98 |
|        towel        | 72.27 | 83.55 |
|        light        |  58.4 | 67.43 |
|        truck        | 40.51 | 49.45 |
|        tower        |  32.1 | 54.84 |
|      chandelier     |  70.8 | 83.01 |
|        awning       | 38.33 | 48.97 |
|     streetlight     | 35.28 | 46.79 |
|        booth        | 49.11 | 59.04 |
| television receiver | 69.19 | 80.42 |
|       airplane      | 55.77 | 72.63 |
|      dirt track     | 11.79 | 42.31 |
|       apparel       |  43.8 | 60.91 |
|         pole        | 27.76 | 40.58 |
|         land        |  4.96 |  7.49 |
|      bannister      |  15.6 | 19.77 |
|      escalator      | 50.39 | 70.32 |
|       ottoman       | 49.06 | 68.11 |
|        bottle       | 37.82 | 59.87 |
|        buffet       | 40.62 | 47.58 |
|        poster       | 26.49 |  37.6 |
|        stage        | 19.78 | 30.98 |
|         van         | 38.22 | 55.93 |
|         ship        | 65.86 | 96.66 |
|       fountain      |  25.0 | 25.29 |
|    conveyer belt    | 79.19 | 90.04 |
|        canopy       | 27.37 | 31.65 |
|        washer       | 71.15 | 73.05 |
|      plaything      | 32.61 | 45.44 |
|    swimming pool    | 53.68 | 70.01 |
|        stool        | 47.54 | 56.76 |
|        barrel       | 27.93 | 74.68 |
|        basket       | 43.41 |  57.3 |
|      waterfall      | 50.62 | 61.22 |
|         tent        |  92.3 | 97.81 |
|         bag         | 17.02 | 21.05 |
|       minibike      | 71.97 | 89.84 |
|        cradle       | 82.05 | 91.77 |
|         oven        | 38.01 | 68.65 |
|         ball        | 52.02 | 61.38 |
|         food        |  57.0 | 65.38 |
|         step        |  9.01 | 10.03 |
|         tank        | 59.67 | 64.24 |
|      trade name     |  31.0 | 40.33 |
|      microwave      | 65.92 | 72.98 |
|         pot         | 52.56 | 59.16 |
|        animal       | 60.34 |  63.5 |
|       bicycle       | 59.72 | 81.06 |
|         lake        | 47.64 | 55.03 |
|      dishwasher     | 74.21 | 82.23 |
|        screen       | 55.75 | 70.51 |
|       blanket       | 22.92 | 26.98 |
|      sculpture      | 69.49 | 86.09 |
|         hood        | 72.35 |  75.9 |
|        sconce       | 54.37 | 66.28 |
|         vase        | 42.87 | 56.15 |
|    traffic light    | 35.74 | 52.19 |
|         tray        | 15.51 | 19.18 |
|        ashcan       | 43.75 | 61.58 |
|         fan         | 71.27 | 84.57 |
|         pier        | 33.64 | 69.62 |
|      crt screen     |  5.11 |  17.4 |
|        plate        | 59.71 | 79.04 |
|       monitor       | 14.51 | 16.15 |
|    bulletin board   | 26.29 | 31.06 |
|        shower       |  9.79 | 23.35 |
|       radiator      | 72.26 | 83.63 |
|        glass        | 16.35 | 18.26 |
|        clock        | 47.61 | 52.26 |
|         flag        | 47.27 | 55.49 |
+---------------------+-------+-------+
2023-02-19 14:13:45,817 - mmseg - INFO - Summary:
2023-02-19 14:13:45,817 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 84.16 | 52.44 | 65.01 |
+-------+-------+-------+
2023-02-19 14:13:49,058 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_128000.pth.
2023-02-19 14:13:49,058 - mmseg - INFO - Best mIoU is 0.5244 at 128000 iter.
2023-02-19 14:13:49,058 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:13:49,058 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8416, mIoU: 0.5244, mAcc: 0.6501, IoU.wall: 0.7916, IoU.building: 0.8255, IoU.sky: 0.9450, IoU.floor: 0.8112, IoU.tree: 0.7560, IoU.ceiling: 0.8509, IoU.road: 0.8403, IoU.bed : 0.9098, IoU.windowpane: 0.6356, IoU.grass: 0.6773, IoU.cabinet: 0.6399, IoU.sidewalk: 0.7081, IoU.person: 0.8221, IoU.earth: 0.3667, IoU.door: 0.5437, IoU.table: 0.6599, IoU.mountain: 0.5904, IoU.plant: 0.5170, IoU.curtain: 0.7593, IoU.chair: 0.6497, IoU.car: 0.8542, IoU.water: 0.5372, IoU.painting: 0.7876, IoU.sofa: 0.7487, IoU.shelf: 0.4594, IoU.house: 0.3674, IoU.sea: 0.6376, IoU.mirror: 0.7206, IoU.rug: 0.5070, IoU.field: 0.3165, IoU.armchair: 0.5064, IoU.seat: 0.6203, IoU.fence: 0.4482, IoU.desk: 0.5345, IoU.rock: 0.5034, IoU.wardrobe: 0.5104, IoU.lamp: 0.6865, IoU.bathtub: 0.8260, IoU.railing: 0.3872, IoU.cushion: 0.6281, IoU.base: 0.4346, IoU.box: 0.2986, IoU.column: 0.5044, IoU.signboard: 0.4182, IoU.chest of drawers: 0.4015, IoU.counter: 0.2949, IoU.sand: 0.5599, IoU.sink: 0.7436, IoU.skyscraper: 0.5234, IoU.fireplace: 0.7703, IoU.refrigerator: 0.8576, IoU.grandstand: 0.4643, IoU.path: 0.2488, IoU.stairs: 0.2800, IoU.runway: 0.6819, IoU.case: 0.5154, IoU.pool table: 0.9359, IoU.pillow: 0.6082, IoU.screen door: 0.7745, IoU.stairway: 0.3096, IoU.river: 0.0995, IoU.bridge: 0.7130, IoU.bookcase: 0.4911, IoU.blind: 0.4750, IoU.coffee table: 0.6452, IoU.toilet: 0.8777, IoU.flower: 0.4268, IoU.book: 0.4470, IoU.hill: 0.1282, IoU.bench: 0.4842, IoU.countertop: 0.5561, IoU.stove: 0.8165, IoU.palm: 0.5557, IoU.kitchen island: 0.4807, IoU.computer: 0.7530, IoU.swivel chair: 0.4422, IoU.boat: 0.5376, IoU.bar: 0.4600, IoU.arcade machine: 0.3884, IoU.hovel: 0.4733, IoU.bus: 0.8966, IoU.towel: 0.7227, IoU.light: 0.5840, IoU.truck: 0.4051, IoU.tower: 0.3210, IoU.chandelier: 0.7080, IoU.awning: 0.3833, IoU.streetlight: 0.3528, IoU.booth: 0.4911, IoU.television receiver: 0.6919, IoU.airplane: 0.5577, IoU.dirt track: 0.1179, IoU.apparel: 0.4380, IoU.pole: 0.2776, IoU.land: 0.0496, IoU.bannister: 0.1560, IoU.escalator: 0.5039, IoU.ottoman: 0.4906, IoU.bottle: 0.3782, IoU.buffet: 0.4062, IoU.poster: 0.2649, IoU.stage: 0.1978, IoU.van: 0.3822, IoU.ship: 0.6586, IoU.fountain: 0.2500, IoU.conveyer belt: 0.7919, IoU.canopy: 0.2737, IoU.washer: 0.7115, IoU.plaything: 0.3261, IoU.swimming pool: 0.5368, IoU.stool: 0.4754, IoU.barrel: 0.2793, IoU.basket: 0.4341, IoU.waterfall: 0.5062, IoU.tent: 0.9230, IoU.bag: 0.1702, IoU.minibike: 0.7197, IoU.cradle: 0.8205, IoU.oven: 0.3801, IoU.ball: 0.5202, IoU.food: 0.5700, IoU.step: 0.0901, IoU.tank: 0.5967, IoU.trade name: 0.3100, IoU.microwave: 0.6592, IoU.pot: 0.5256, IoU.animal: 0.6034, IoU.bicycle: 0.5972, IoU.lake: 0.4764, IoU.dishwasher: 0.7421, IoU.screen: 0.5575, IoU.blanket: 0.2292, IoU.sculpture: 0.6949, IoU.hood: 0.7235, IoU.sconce: 0.5437, IoU.vase: 0.4287, IoU.traffic light: 0.3574, IoU.tray: 0.1551, IoU.ashcan: 0.4375, IoU.fan: 0.7127, IoU.pier: 0.3364, IoU.crt screen: 0.0511, IoU.plate: 0.5971, IoU.monitor: 0.1451, IoU.bulletin board: 0.2629, IoU.shower: 0.0979, IoU.radiator: 0.7226, IoU.glass: 0.1635, IoU.clock: 0.4761, IoU.flag: 0.4727, Acc.wall: 0.8868, Acc.building: 0.9194, Acc.sky: 0.9772, Acc.floor: 0.9074, Acc.tree: 0.8890, Acc.ceiling: 0.9405, Acc.road: 0.9027, Acc.bed : 0.9607, Acc.windowpane: 0.7858, Acc.grass: 0.8253, Acc.cabinet: 0.7698, Acc.sidewalk: 0.8510, Acc.person: 0.9352, Acc.earth: 0.4955, Acc.door: 0.6986, Acc.table: 0.7883, Acc.mountain: 0.7194, Acc.plant: 0.6477, Acc.curtain: 0.8637, Acc.chair: 0.8132, Acc.car: 0.9259, Acc.water: 0.6612, Acc.painting: 0.8834, Acc.sofa: 0.8767, Acc.shelf: 0.6294, Acc.house: 0.4653, Acc.sea: 0.8452, Acc.mirror: 0.8210, Acc.rug: 0.5886, Acc.field: 0.4793, Acc.armchair: 0.6363, Acc.seat: 0.8648, Acc.fence: 0.6663, Acc.desk: 0.7621, Acc.rock: 0.7254, Acc.wardrobe: 0.6598, Acc.lamp: 0.7805, Acc.bathtub: 0.8680, Acc.railing: 0.5238, Acc.cushion: 0.7754, Acc.base: 0.5778, Acc.box: 0.3763, Acc.column: 0.6492, Acc.signboard: 0.5814, Acc.chest of drawers: 0.5959, Acc.counter: 0.3777, Acc.sand: 0.8231, Acc.sink: 0.8270, Acc.skyscraper: 0.6627, Acc.fireplace: 0.9163, Acc.refrigerator: 0.9213, Acc.grandstand: 0.7610, Acc.path: 0.3692, Acc.stairs: 0.3665, Acc.runway: 0.8953, Acc.case: 0.7250, Acc.pool table: 0.9695, Acc.pillow: 0.6967, Acc.screen door: 0.8011, Acc.stairway: 0.4256, Acc.river: 0.2511, Acc.bridge: 0.8366, Acc.bookcase: 0.6663, Acc.blind: 0.5302, Acc.coffee table: 0.7775, Acc.toilet: 0.9178, Acc.flower: 0.5900, Acc.book: 0.6366, Acc.hill: 0.1881, Acc.bench: 0.5434, Acc.countertop: 0.7973, Acc.stove: 0.8570, Acc.palm: 0.8291, Acc.kitchen island: 0.7906, Acc.computer: 0.8393, Acc.swivel chair: 0.6146, Acc.boat: 0.5974, Acc.bar: 0.6079, Acc.arcade machine: 0.4188, Acc.hovel: 0.5265, Acc.bus: 0.9698, Acc.towel: 0.8355, Acc.light: 0.6743, Acc.truck: 0.4945, Acc.tower: 0.5484, Acc.chandelier: 0.8301, Acc.awning: 0.4897, Acc.streetlight: 0.4679, Acc.booth: 0.5904, Acc.television receiver: 0.8042, Acc.airplane: 0.7263, Acc.dirt track: 0.4231, Acc.apparel: 0.6091, Acc.pole: 0.4058, Acc.land: 0.0749, Acc.bannister: 0.1977, Acc.escalator: 0.7032, Acc.ottoman: 0.6811, Acc.bottle: 0.5987, Acc.buffet: 0.4758, Acc.poster: 0.3760, Acc.stage: 0.3098, Acc.van: 0.5593, Acc.ship: 0.9666, Acc.fountain: 0.2529, Acc.conveyer belt: 0.9004, Acc.canopy: 0.3165, Acc.washer: 0.7305, Acc.plaything: 0.4544, Acc.swimming pool: 0.7001, Acc.stool: 0.5676, Acc.barrel: 0.7468, Acc.basket: 0.5730, Acc.waterfall: 0.6122, Acc.tent: 0.9781, Acc.bag: 0.2105, Acc.minibike: 0.8984, Acc.cradle: 0.9177, Acc.oven: 0.6865, Acc.ball: 0.6138, Acc.food: 0.6538, Acc.step: 0.1003, Acc.tank: 0.6424, Acc.trade name: 0.4033, Acc.microwave: 0.7298, Acc.pot: 0.5916, Acc.animal: 0.6350, Acc.bicycle: 0.8106, Acc.lake: 0.5503, Acc.dishwasher: 0.8223, Acc.screen: 0.7051, Acc.blanket: 0.2698, Acc.sculpture: 0.8609, Acc.hood: 0.7590, Acc.sconce: 0.6628, Acc.vase: 0.5615, Acc.traffic light: 0.5219, Acc.tray: 0.1918, Acc.ashcan: 0.6158, Acc.fan: 0.8457, Acc.pier: 0.6962, Acc.crt screen: 0.1740, Acc.plate: 0.7904, Acc.monitor: 0.1615, Acc.bulletin board: 0.3106, Acc.shower: 0.2335, Acc.radiator: 0.8363, Acc.glass: 0.1826, Acc.clock: 0.5226, Acc.flag: 0.5549
2023-02-19 14:14:03,911 - mmseg - INFO - Iter [128050/160000]	lr: 1.198e-05, eta: 2:32:15, time: 0.662, data_time: 0.369, memory: 15214, decode.loss_ce: 0.0993, decode.acc_seg: 95.5351, aux.loss_ce: 0.0598, aux.acc_seg: 93.5163, loss: 0.1591, grad_norm: 1.4302
2023-02-19 14:14:17,559 - mmseg - INFO - Iter [128100/160000]	lr: 1.196e-05, eta: 2:32:01, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.6433, aux.loss_ce: 0.0575, aux.acc_seg: 93.7975, loss: 0.1548, grad_norm: 1.2455
2023-02-19 14:14:31,578 - mmseg - INFO - Iter [128150/160000]	lr: 1.194e-05, eta: 2:31:46, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0979, decode.acc_seg: 95.7634, aux.loss_ce: 0.0609, aux.acc_seg: 93.4899, loss: 0.1588, grad_norm: 1.2307
2023-02-19 14:14:45,309 - mmseg - INFO - Iter [128200/160000]	lr: 1.193e-05, eta: 2:31:32, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.5969, aux.loss_ce: 0.0610, aux.acc_seg: 93.4584, loss: 0.1599, grad_norm: 1.3469
2023-02-19 14:14:59,158 - mmseg - INFO - Iter [128250/160000]	lr: 1.191e-05, eta: 2:31:17, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0999, decode.acc_seg: 95.6595, aux.loss_ce: 0.0592, aux.acc_seg: 93.6957, loss: 0.1592, grad_norm: 1.3099
2023-02-19 14:15:12,867 - mmseg - INFO - Iter [128300/160000]	lr: 1.189e-05, eta: 2:31:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.4438, aux.loss_ce: 0.0590, aux.acc_seg: 93.5592, loss: 0.1608, grad_norm: 1.6710
2023-02-19 14:15:27,755 - mmseg - INFO - Iter [128350/160000]	lr: 1.187e-05, eta: 2:30:49, time: 0.298, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1034, decode.acc_seg: 95.3791, aux.loss_ce: 0.0627, aux.acc_seg: 93.2528, loss: 0.1661, grad_norm: 1.7265
2023-02-19 14:15:41,621 - mmseg - INFO - Iter [128400/160000]	lr: 1.185e-05, eta: 2:30:34, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1023, decode.acc_seg: 95.4784, aux.loss_ce: 0.0595, aux.acc_seg: 93.6309, loss: 0.1617, grad_norm: 1.2032
2023-02-19 14:15:55,237 - mmseg - INFO - Iter [128450/160000]	lr: 1.183e-05, eta: 2:30:20, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1021, decode.acc_seg: 95.5266, aux.loss_ce: 0.0594, aux.acc_seg: 93.6086, loss: 0.1615, grad_norm: 2.0268
2023-02-19 14:16:08,865 - mmseg - INFO - Iter [128500/160000]	lr: 1.181e-05, eta: 2:30:05, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1033, decode.acc_seg: 95.4033, aux.loss_ce: 0.0600, aux.acc_seg: 93.4687, loss: 0.1633, grad_norm: 1.4385
2023-02-19 14:16:22,804 - mmseg - INFO - Iter [128550/160000]	lr: 1.179e-05, eta: 2:29:51, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1029, decode.acc_seg: 95.3954, aux.loss_ce: 0.0601, aux.acc_seg: 93.4684, loss: 0.1630, grad_norm: 1.7852
2023-02-19 14:16:36,827 - mmseg - INFO - Iter [128600/160000]	lr: 1.178e-05, eta: 2:29:37, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0961, decode.acc_seg: 95.7008, aux.loss_ce: 0.0576, aux.acc_seg: 93.7344, loss: 0.1537, grad_norm: 1.0940
2023-02-19 14:16:51,573 - mmseg - INFO - Iter [128650/160000]	lr: 1.176e-05, eta: 2:29:22, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.5929, aux.loss_ce: 0.0600, aux.acc_seg: 93.4856, loss: 0.1588, grad_norm: 1.1035
2023-02-19 14:17:05,551 - mmseg - INFO - Iter [128700/160000]	lr: 1.174e-05, eta: 2:29:08, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1009, decode.acc_seg: 95.5505, aux.loss_ce: 0.0625, aux.acc_seg: 93.4008, loss: 0.1634, grad_norm: 1.5246
2023-02-19 14:17:20,604 - mmseg - INFO - Iter [128750/160000]	lr: 1.172e-05, eta: 2:28:54, time: 0.301, data_time: 0.006, memory: 15214, decode.loss_ce: 0.1022, decode.acc_seg: 95.5266, aux.loss_ce: 0.0617, aux.acc_seg: 93.4192, loss: 0.1639, grad_norm: 1.4799
2023-02-19 14:17:34,168 - mmseg - INFO - Iter [128800/160000]	lr: 1.170e-05, eta: 2:28:40, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.5543, aux.loss_ce: 0.0593, aux.acc_seg: 93.5765, loss: 0.1600, grad_norm: 1.2729
2023-02-19 14:17:50,010 - mmseg - INFO - Iter [128850/160000]	lr: 1.168e-05, eta: 2:28:26, time: 0.317, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.7579, aux.loss_ce: 0.0577, aux.acc_seg: 93.8072, loss: 0.1533, grad_norm: 1.4504
2023-02-19 14:18:03,791 - mmseg - INFO - Iter [128900/160000]	lr: 1.166e-05, eta: 2:28:11, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1039, decode.acc_seg: 95.3019, aux.loss_ce: 0.0605, aux.acc_seg: 93.3314, loss: 0.1644, grad_norm: 1.1453
2023-02-19 14:18:17,879 - mmseg - INFO - Iter [128950/160000]	lr: 1.164e-05, eta: 2:27:57, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.4487, aux.loss_ce: 0.0610, aux.acc_seg: 93.3366, loss: 0.1622, grad_norm: 1.3197
2023-02-19 14:18:31,834 - mmseg - INFO - Saving checkpoint at 129000 iterations
2023-02-19 14:18:35,196 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:18:35,197 - mmseg - INFO - Iter [129000/160000]	lr: 1.163e-05, eta: 2:27:43, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1013, decode.acc_seg: 95.5085, aux.loss_ce: 0.0614, aux.acc_seg: 93.3851, loss: 0.1626, grad_norm: 1.3109
2023-02-19 14:18:49,117 - mmseg - INFO - Iter [129050/160000]	lr: 1.161e-05, eta: 2:27:29, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.6158, aux.loss_ce: 0.0583, aux.acc_seg: 93.7657, loss: 0.1580, grad_norm: 1.2521
2023-02-19 14:19:03,256 - mmseg - INFO - Iter [129100/160000]	lr: 1.159e-05, eta: 2:27:15, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.5161, aux.loss_ce: 0.0601, aux.acc_seg: 93.4068, loss: 0.1614, grad_norm: 1.4917
2023-02-19 14:19:16,990 - mmseg - INFO - Iter [129150/160000]	lr: 1.157e-05, eta: 2:27:00, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.4518, aux.loss_ce: 0.0599, aux.acc_seg: 93.4584, loss: 0.1611, grad_norm: 1.6025
2023-02-19 14:19:30,634 - mmseg - INFO - Iter [129200/160000]	lr: 1.155e-05, eta: 2:26:46, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1000, decode.acc_seg: 95.6636, aux.loss_ce: 0.0594, aux.acc_seg: 93.6040, loss: 0.1594, grad_norm: 1.4017
2023-02-19 14:19:44,515 - mmseg - INFO - Iter [129250/160000]	lr: 1.153e-05, eta: 2:26:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6177, aux.loss_ce: 0.0573, aux.acc_seg: 93.6797, loss: 0.1541, grad_norm: 1.3129
2023-02-19 14:19:58,855 - mmseg - INFO - Iter [129300/160000]	lr: 1.151e-05, eta: 2:26:17, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0969, decode.acc_seg: 95.5717, aux.loss_ce: 0.0569, aux.acc_seg: 93.6846, loss: 0.1537, grad_norm: 1.1758
2023-02-19 14:20:12,744 - mmseg - INFO - Iter [129350/160000]	lr: 1.149e-05, eta: 2:26:03, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0982, decode.acc_seg: 95.5463, aux.loss_ce: 0.0593, aux.acc_seg: 93.5197, loss: 0.1575, grad_norm: 1.4078
2023-02-19 14:20:26,773 - mmseg - INFO - Iter [129400/160000]	lr: 1.148e-05, eta: 2:25:48, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0986, decode.acc_seg: 95.6296, aux.loss_ce: 0.0590, aux.acc_seg: 93.6439, loss: 0.1576, grad_norm: 1.3353
2023-02-19 14:20:40,476 - mmseg - INFO - Iter [129450/160000]	lr: 1.146e-05, eta: 2:25:34, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1003, decode.acc_seg: 95.6337, aux.loss_ce: 0.0606, aux.acc_seg: 93.5444, loss: 0.1609, grad_norm: 2.3751
2023-02-19 14:20:55,212 - mmseg - INFO - Iter [129500/160000]	lr: 1.144e-05, eta: 2:25:20, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.7077, aux.loss_ce: 0.0576, aux.acc_seg: 93.7714, loss: 0.1538, grad_norm: 1.2466
2023-02-19 14:21:08,990 - mmseg - INFO - Iter [129550/160000]	lr: 1.142e-05, eta: 2:25:05, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1045, decode.acc_seg: 95.5060, aux.loss_ce: 0.0602, aux.acc_seg: 93.6803, loss: 0.1647, grad_norm: 1.8020
2023-02-19 14:21:22,781 - mmseg - INFO - Iter [129600/160000]	lr: 1.140e-05, eta: 2:24:51, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1015, decode.acc_seg: 95.5041, aux.loss_ce: 0.0579, aux.acc_seg: 93.7877, loss: 0.1594, grad_norm: 1.4196
2023-02-19 14:21:36,980 - mmseg - INFO - Iter [129650/160000]	lr: 1.138e-05, eta: 2:24:36, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1003, decode.acc_seg: 95.4591, aux.loss_ce: 0.0584, aux.acc_seg: 93.6146, loss: 0.1586, grad_norm: 1.4912
2023-02-19 14:21:50,856 - mmseg - INFO - Iter [129700/160000]	lr: 1.136e-05, eta: 2:24:22, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0993, decode.acc_seg: 95.6299, aux.loss_ce: 0.0602, aux.acc_seg: 93.5189, loss: 0.1595, grad_norm: 1.2302
2023-02-19 14:22:05,682 - mmseg - INFO - Iter [129750/160000]	lr: 1.134e-05, eta: 2:24:08, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1037, decode.acc_seg: 95.3926, aux.loss_ce: 0.0603, aux.acc_seg: 93.5268, loss: 0.1640, grad_norm: 1.6064
2023-02-19 14:22:20,358 - mmseg - INFO - Iter [129800/160000]	lr: 1.133e-05, eta: 2:23:54, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.8099, aux.loss_ce: 0.0551, aux.acc_seg: 94.1364, loss: 0.1506, grad_norm: 1.2031
2023-02-19 14:22:33,943 - mmseg - INFO - Iter [129850/160000]	lr: 1.131e-05, eta: 2:23:39, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0945, decode.acc_seg: 95.6991, aux.loss_ce: 0.0577, aux.acc_seg: 93.6039, loss: 0.1523, grad_norm: 1.2359
2023-02-19 14:22:47,734 - mmseg - INFO - Iter [129900/160000]	lr: 1.129e-05, eta: 2:23:25, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.3862, aux.loss_ce: 0.0604, aux.acc_seg: 93.3777, loss: 0.1635, grad_norm: 1.5506
2023-02-19 14:23:01,607 - mmseg - INFO - Iter [129950/160000]	lr: 1.127e-05, eta: 2:23:10, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1010, decode.acc_seg: 95.5249, aux.loss_ce: 0.0581, aux.acc_seg: 93.7074, loss: 0.1592, grad_norm: 1.2441
2023-02-19 14:23:15,422 - mmseg - INFO - Saving checkpoint at 130000 iterations
2023-02-19 14:23:18,654 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:23:18,655 - mmseg - INFO - Iter [130000/160000]	lr: 1.125e-05, eta: 2:22:57, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7174, aux.loss_ce: 0.0587, aux.acc_seg: 93.7903, loss: 0.1548, grad_norm: 1.1817
2023-02-19 14:23:32,409 - mmseg - INFO - Iter [130050/160000]	lr: 1.123e-05, eta: 2:22:42, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.7266, aux.loss_ce: 0.0581, aux.acc_seg: 93.7214, loss: 0.1556, grad_norm: 1.5428
2023-02-19 14:23:48,651 - mmseg - INFO - Iter [130100/160000]	lr: 1.121e-05, eta: 2:22:28, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.5757, aux.loss_ce: 0.0594, aux.acc_seg: 93.5415, loss: 0.1578, grad_norm: 1.5046
2023-02-19 14:24:02,418 - mmseg - INFO - Iter [130150/160000]	lr: 1.119e-05, eta: 2:22:14, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0972, decode.acc_seg: 95.7116, aux.loss_ce: 0.0580, aux.acc_seg: 93.7371, loss: 0.1551, grad_norm: 1.4946
2023-02-19 14:24:16,028 - mmseg - INFO - Iter [130200/160000]	lr: 1.118e-05, eta: 2:22:00, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.5689, aux.loss_ce: 0.0592, aux.acc_seg: 93.4892, loss: 0.1564, grad_norm: 1.3983
2023-02-19 14:24:29,662 - mmseg - INFO - Iter [130250/160000]	lr: 1.116e-05, eta: 2:21:45, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5847, aux.loss_ce: 0.0604, aux.acc_seg: 93.5131, loss: 0.1594, grad_norm: 1.4016
2023-02-19 14:24:43,634 - mmseg - INFO - Iter [130300/160000]	lr: 1.114e-05, eta: 2:21:31, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.7440, aux.loss_ce: 0.0580, aux.acc_seg: 93.7857, loss: 0.1544, grad_norm: 1.2902
2023-02-19 14:24:57,875 - mmseg - INFO - Iter [130350/160000]	lr: 1.112e-05, eta: 2:21:16, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.7457, aux.loss_ce: 0.0561, aux.acc_seg: 93.8444, loss: 0.1505, grad_norm: 1.2209
2023-02-19 14:25:11,735 - mmseg - INFO - Iter [130400/160000]	lr: 1.110e-05, eta: 2:21:02, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1027, decode.acc_seg: 95.4100, aux.loss_ce: 0.0600, aux.acc_seg: 93.3915, loss: 0.1627, grad_norm: 1.3029
2023-02-19 14:25:26,125 - mmseg - INFO - Iter [130450/160000]	lr: 1.108e-05, eta: 2:20:48, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.4904, aux.loss_ce: 0.0604, aux.acc_seg: 93.4416, loss: 0.1599, grad_norm: 1.3271
2023-02-19 14:25:39,749 - mmseg - INFO - Iter [130500/160000]	lr: 1.106e-05, eta: 2:20:33, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.4906, aux.loss_ce: 0.0598, aux.acc_seg: 93.3756, loss: 0.1594, grad_norm: 1.5924
2023-02-19 14:25:53,383 - mmseg - INFO - Iter [130550/160000]	lr: 1.104e-05, eta: 2:20:19, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0969, decode.acc_seg: 95.6400, aux.loss_ce: 0.0592, aux.acc_seg: 93.5157, loss: 0.1561, grad_norm: 1.0525
2023-02-19 14:26:07,209 - mmseg - INFO - Iter [130600/160000]	lr: 1.103e-05, eta: 2:20:05, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.7761, aux.loss_ce: 0.0592, aux.acc_seg: 93.6885, loss: 0.1554, grad_norm: 1.7675
2023-02-19 14:26:21,306 - mmseg - INFO - Iter [130650/160000]	lr: 1.101e-05, eta: 2:19:50, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1072, decode.acc_seg: 95.2795, aux.loss_ce: 0.0621, aux.acc_seg: 93.3343, loss: 0.1693, grad_norm: 1.5927
2023-02-19 14:26:36,126 - mmseg - INFO - Iter [130700/160000]	lr: 1.099e-05, eta: 2:19:36, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.8500, aux.loss_ce: 0.0576, aux.acc_seg: 93.8499, loss: 0.1523, grad_norm: 1.2605
2023-02-19 14:26:49,753 - mmseg - INFO - Iter [130750/160000]	lr: 1.097e-05, eta: 2:19:22, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0968, decode.acc_seg: 95.7085, aux.loss_ce: 0.0583, aux.acc_seg: 93.6618, loss: 0.1551, grad_norm: 1.2234
2023-02-19 14:27:03,484 - mmseg - INFO - Iter [130800/160000]	lr: 1.095e-05, eta: 2:19:07, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1015, decode.acc_seg: 95.5152, aux.loss_ce: 0.0584, aux.acc_seg: 93.7231, loss: 0.1600, grad_norm: 1.4546
2023-02-19 14:27:17,503 - mmseg - INFO - Iter [130850/160000]	lr: 1.093e-05, eta: 2:18:53, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.5272, aux.loss_ce: 0.0616, aux.acc_seg: 93.4176, loss: 0.1646, grad_norm: 1.9415
2023-02-19 14:27:31,634 - mmseg - INFO - Iter [130900/160000]	lr: 1.091e-05, eta: 2:18:38, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0976, decode.acc_seg: 95.6996, aux.loss_ce: 0.0572, aux.acc_seg: 93.8779, loss: 0.1548, grad_norm: 1.2358
2023-02-19 14:27:45,818 - mmseg - INFO - Iter [130950/160000]	lr: 1.089e-05, eta: 2:18:24, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7697, aux.loss_ce: 0.0560, aux.acc_seg: 93.9928, loss: 0.1521, grad_norm: 1.1783
2023-02-19 14:27:59,499 - mmseg - INFO - Saving checkpoint at 131000 iterations
2023-02-19 14:28:02,780 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:28:02,781 - mmseg - INFO - Iter [131000/160000]	lr: 1.088e-05, eta: 2:18:10, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1047, decode.acc_seg: 95.2934, aux.loss_ce: 0.0610, aux.acc_seg: 93.2799, loss: 0.1657, grad_norm: 1.7383
2023-02-19 14:28:16,846 - mmseg - INFO - Iter [131050/160000]	lr: 1.086e-05, eta: 2:17:56, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1024, decode.acc_seg: 95.5475, aux.loss_ce: 0.0594, aux.acc_seg: 93.6771, loss: 0.1618, grad_norm: 1.4908
2023-02-19 14:28:31,143 - mmseg - INFO - Iter [131100/160000]	lr: 1.084e-05, eta: 2:17:42, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1024, decode.acc_seg: 95.4069, aux.loss_ce: 0.0600, aux.acc_seg: 93.4663, loss: 0.1624, grad_norm: 1.1814
2023-02-19 14:28:44,822 - mmseg - INFO - Iter [131150/160000]	lr: 1.082e-05, eta: 2:17:27, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.7998, aux.loss_ce: 0.0558, aux.acc_seg: 93.9990, loss: 0.1507, grad_norm: 1.1872
2023-02-19 14:28:58,437 - mmseg - INFO - Iter [131200/160000]	lr: 1.080e-05, eta: 2:17:13, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1053, decode.acc_seg: 95.2596, aux.loss_ce: 0.0595, aux.acc_seg: 93.4218, loss: 0.1648, grad_norm: 1.3412
2023-02-19 14:29:12,126 - mmseg - INFO - Iter [131250/160000]	lr: 1.078e-05, eta: 2:16:59, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0968, decode.acc_seg: 95.6216, aux.loss_ce: 0.0565, aux.acc_seg: 93.8237, loss: 0.1533, grad_norm: 1.2159
2023-02-19 14:29:26,141 - mmseg - INFO - Iter [131300/160000]	lr: 1.076e-05, eta: 2:16:44, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.4899, aux.loss_ce: 0.0575, aux.acc_seg: 93.6335, loss: 0.1566, grad_norm: 1.5289
2023-02-19 14:29:40,193 - mmseg - INFO - Iter [131350/160000]	lr: 1.074e-05, eta: 2:16:30, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0994, decode.acc_seg: 95.5176, aux.loss_ce: 0.0590, aux.acc_seg: 93.5096, loss: 0.1584, grad_norm: 1.7294
2023-02-19 14:29:56,597 - mmseg - INFO - Iter [131400/160000]	lr: 1.073e-05, eta: 2:16:16, time: 0.328, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1038, decode.acc_seg: 95.4355, aux.loss_ce: 0.0600, aux.acc_seg: 93.6409, loss: 0.1638, grad_norm: 1.9834
2023-02-19 14:30:10,138 - mmseg - INFO - Iter [131450/160000]	lr: 1.071e-05, eta: 2:16:02, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1039, decode.acc_seg: 95.5228, aux.loss_ce: 0.0631, aux.acc_seg: 93.3359, loss: 0.1671, grad_norm: 1.3423
2023-02-19 14:30:23,882 - mmseg - INFO - Iter [131500/160000]	lr: 1.069e-05, eta: 2:15:47, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0982, decode.acc_seg: 95.6676, aux.loss_ce: 0.0571, aux.acc_seg: 93.8559, loss: 0.1552, grad_norm: 1.1750
2023-02-19 14:30:37,939 - mmseg - INFO - Iter [131550/160000]	lr: 1.067e-05, eta: 2:15:33, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.6207, aux.loss_ce: 0.0592, aux.acc_seg: 93.6427, loss: 0.1582, grad_norm: 1.8480
2023-02-19 14:30:51,542 - mmseg - INFO - Iter [131600/160000]	lr: 1.065e-05, eta: 2:15:18, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1081, decode.acc_seg: 95.2634, aux.loss_ce: 0.0604, aux.acc_seg: 93.5063, loss: 0.1685, grad_norm: 2.2226
2023-02-19 14:31:05,206 - mmseg - INFO - Iter [131650/160000]	lr: 1.063e-05, eta: 2:15:04, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.7708, aux.loss_ce: 0.0556, aux.acc_seg: 93.8361, loss: 0.1487, grad_norm: 1.1389
2023-02-19 14:31:20,254 - mmseg - INFO - Iter [131700/160000]	lr: 1.061e-05, eta: 2:14:50, time: 0.301, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0999, decode.acc_seg: 95.5876, aux.loss_ce: 0.0581, aux.acc_seg: 93.6385, loss: 0.1579, grad_norm: 1.3743
2023-02-19 14:31:34,272 - mmseg - INFO - Iter [131750/160000]	lr: 1.059e-05, eta: 2:14:35, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0993, decode.acc_seg: 95.5858, aux.loss_ce: 0.0591, aux.acc_seg: 93.5842, loss: 0.1584, grad_norm: 1.7268
2023-02-19 14:31:48,908 - mmseg - INFO - Iter [131800/160000]	lr: 1.058e-05, eta: 2:14:21, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8285, aux.loss_ce: 0.0573, aux.acc_seg: 93.7358, loss: 0.1505, grad_norm: 1.0756
2023-02-19 14:32:03,299 - mmseg - INFO - Iter [131850/160000]	lr: 1.056e-05, eta: 2:14:07, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.7324, aux.loss_ce: 0.0572, aux.acc_seg: 93.8481, loss: 0.1542, grad_norm: 1.1968
2023-02-19 14:32:18,140 - mmseg - INFO - Iter [131900/160000]	lr: 1.054e-05, eta: 2:13:53, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.6795, aux.loss_ce: 0.0593, aux.acc_seg: 93.4640, loss: 0.1550, grad_norm: 1.4279
2023-02-19 14:32:31,962 - mmseg - INFO - Iter [131950/160000]	lr: 1.052e-05, eta: 2:13:38, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.5820, aux.loss_ce: 0.0602, aux.acc_seg: 93.4706, loss: 0.1591, grad_norm: 1.6216
2023-02-19 14:32:46,055 - mmseg - INFO - Saving checkpoint at 132000 iterations
2023-02-19 14:32:49,340 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:32:49,340 - mmseg - INFO - Iter [132000/160000]	lr: 1.050e-05, eta: 2:13:25, time: 0.348, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.6709, aux.loss_ce: 0.0563, aux.acc_seg: 93.9632, loss: 0.1551, grad_norm: 1.3416
2023-02-19 14:33:03,182 - mmseg - INFO - Iter [132050/160000]	lr: 1.048e-05, eta: 2:13:10, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0982, decode.acc_seg: 95.5910, aux.loss_ce: 0.0584, aux.acc_seg: 93.6427, loss: 0.1566, grad_norm: 1.7623
2023-02-19 14:33:16,951 - mmseg - INFO - Iter [132100/160000]	lr: 1.046e-05, eta: 2:12:56, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.5942, aux.loss_ce: 0.0595, aux.acc_seg: 93.6852, loss: 0.1602, grad_norm: 1.1384
2023-02-19 14:33:30,644 - mmseg - INFO - Iter [132150/160000]	lr: 1.044e-05, eta: 2:12:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0979, decode.acc_seg: 95.5673, aux.loss_ce: 0.0583, aux.acc_seg: 93.6211, loss: 0.1562, grad_norm: 1.2514
2023-02-19 14:33:45,086 - mmseg - INFO - Iter [132200/160000]	lr: 1.043e-05, eta: 2:12:27, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.7427, aux.loss_ce: 0.0573, aux.acc_seg: 93.8470, loss: 0.1539, grad_norm: 1.3218
2023-02-19 14:33:59,528 - mmseg - INFO - Iter [132250/160000]	lr: 1.041e-05, eta: 2:12:13, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5106, aux.loss_ce: 0.0584, aux.acc_seg: 93.6069, loss: 0.1574, grad_norm: 1.2921
2023-02-19 14:34:13,863 - mmseg - INFO - Iter [132300/160000]	lr: 1.039e-05, eta: 2:11:59, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1045, decode.acc_seg: 95.3956, aux.loss_ce: 0.0628, aux.acc_seg: 93.3192, loss: 0.1673, grad_norm: 1.5337
2023-02-19 14:34:27,901 - mmseg - INFO - Iter [132350/160000]	lr: 1.037e-05, eta: 2:11:44, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0986, decode.acc_seg: 95.6258, aux.loss_ce: 0.0578, aux.acc_seg: 93.6382, loss: 0.1565, grad_norm: 1.2298
2023-02-19 14:34:41,595 - mmseg - INFO - Iter [132400/160000]	lr: 1.035e-05, eta: 2:11:30, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0989, decode.acc_seg: 95.5908, aux.loss_ce: 0.0605, aux.acc_seg: 93.4327, loss: 0.1594, grad_norm: 1.4084
2023-02-19 14:34:55,320 - mmseg - INFO - Iter [132450/160000]	lr: 1.033e-05, eta: 2:11:15, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.6181, aux.loss_ce: 0.0579, aux.acc_seg: 93.7889, loss: 0.1567, grad_norm: 1.1533
2023-02-19 14:35:09,262 - mmseg - INFO - Iter [132500/160000]	lr: 1.031e-05, eta: 2:11:01, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.8891, aux.loss_ce: 0.0582, aux.acc_seg: 93.7926, loss: 0.1539, grad_norm: 1.3599
2023-02-19 14:35:23,767 - mmseg - INFO - Iter [132550/160000]	lr: 1.029e-05, eta: 2:10:47, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1028, decode.acc_seg: 95.5360, aux.loss_ce: 0.0607, aux.acc_seg: 93.5307, loss: 0.1635, grad_norm: 1.2369
2023-02-19 14:35:37,473 - mmseg - INFO - Iter [132600/160000]	lr: 1.028e-05, eta: 2:10:32, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.4396, aux.loss_ce: 0.0586, aux.acc_seg: 93.5426, loss: 0.1592, grad_norm: 1.2661
2023-02-19 14:35:53,322 - mmseg - INFO - Iter [132650/160000]	lr: 1.026e-05, eta: 2:10:18, time: 0.317, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0939, decode.acc_seg: 95.8418, aux.loss_ce: 0.0560, aux.acc_seg: 93.9664, loss: 0.1498, grad_norm: 1.0386
2023-02-19 14:36:07,430 - mmseg - INFO - Iter [132700/160000]	lr: 1.024e-05, eta: 2:10:04, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0978, decode.acc_seg: 95.5431, aux.loss_ce: 0.0584, aux.acc_seg: 93.5332, loss: 0.1562, grad_norm: 1.4670
2023-02-19 14:36:21,766 - mmseg - INFO - Iter [132750/160000]	lr: 1.022e-05, eta: 2:09:50, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6864, aux.loss_ce: 0.0584, aux.acc_seg: 93.5688, loss: 0.1552, grad_norm: 1.1404
2023-02-19 14:36:35,780 - mmseg - INFO - Iter [132800/160000]	lr: 1.020e-05, eta: 2:09:36, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0982, decode.acc_seg: 95.7550, aux.loss_ce: 0.0586, aux.acc_seg: 93.7688, loss: 0.1568, grad_norm: 1.2786
2023-02-19 14:36:49,521 - mmseg - INFO - Iter [132850/160000]	lr: 1.018e-05, eta: 2:09:21, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.5229, aux.loss_ce: 0.0597, aux.acc_seg: 93.3857, loss: 0.1605, grad_norm: 1.3260
2023-02-19 14:37:03,666 - mmseg - INFO - Iter [132900/160000]	lr: 1.016e-05, eta: 2:09:07, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7972, aux.loss_ce: 0.0557, aux.acc_seg: 93.8985, loss: 0.1499, grad_norm: 1.4588
2023-02-19 14:37:17,334 - mmseg - INFO - Iter [132950/160000]	lr: 1.014e-05, eta: 2:08:52, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0913, decode.acc_seg: 95.8526, aux.loss_ce: 0.0570, aux.acc_seg: 93.8341, loss: 0.1483, grad_norm: 1.4910
2023-02-19 14:37:31,509 - mmseg - INFO - Saving checkpoint at 133000 iterations
2023-02-19 14:37:34,765 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:37:34,766 - mmseg - INFO - Iter [133000/160000]	lr: 1.013e-05, eta: 2:08:39, time: 0.349, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1043, decode.acc_seg: 95.3798, aux.loss_ce: 0.0609, aux.acc_seg: 93.3547, loss: 0.1652, grad_norm: 1.6412
2023-02-19 14:37:49,061 - mmseg - INFO - Iter [133050/160000]	lr: 1.011e-05, eta: 2:08:24, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0975, decode.acc_seg: 95.5775, aux.loss_ce: 0.0585, aux.acc_seg: 93.6020, loss: 0.1560, grad_norm: 1.3045
2023-02-19 14:38:03,283 - mmseg - INFO - Iter [133100/160000]	lr: 1.009e-05, eta: 2:08:10, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1047, decode.acc_seg: 95.4050, aux.loss_ce: 0.0613, aux.acc_seg: 93.4498, loss: 0.1660, grad_norm: 1.7551
2023-02-19 14:38:16,870 - mmseg - INFO - Iter [133150/160000]	lr: 1.007e-05, eta: 2:07:56, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.6109, aux.loss_ce: 0.0588, aux.acc_seg: 93.6393, loss: 0.1575, grad_norm: 1.3241
2023-02-19 14:38:31,032 - mmseg - INFO - Iter [133200/160000]	lr: 1.005e-05, eta: 2:07:41, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.5390, aux.loss_ce: 0.0583, aux.acc_seg: 93.5422, loss: 0.1571, grad_norm: 1.3385
2023-02-19 14:38:45,071 - mmseg - INFO - Iter [133250/160000]	lr: 1.003e-05, eta: 2:07:27, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0994, decode.acc_seg: 95.6681, aux.loss_ce: 0.0605, aux.acc_seg: 93.5449, loss: 0.1599, grad_norm: 1.2064
2023-02-19 14:38:58,739 - mmseg - INFO - Iter [133300/160000]	lr: 1.001e-05, eta: 2:07:13, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1009, decode.acc_seg: 95.4976, aux.loss_ce: 0.0621, aux.acc_seg: 93.2968, loss: 0.1629, grad_norm: 1.5665
2023-02-19 14:39:12,738 - mmseg - INFO - Iter [133350/160000]	lr: 9.994e-06, eta: 2:06:58, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7753, aux.loss_ce: 0.0588, aux.acc_seg: 93.5082, loss: 0.1540, grad_norm: 1.5191
2023-02-19 14:39:27,383 - mmseg - INFO - Iter [133400/160000]	lr: 9.975e-06, eta: 2:06:44, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.6157, aux.loss_ce: 0.0587, aux.acc_seg: 93.6681, loss: 0.1582, grad_norm: 1.2804
2023-02-19 14:39:41,009 - mmseg - INFO - Iter [133450/160000]	lr: 9.957e-06, eta: 2:06:30, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.8336, aux.loss_ce: 0.0561, aux.acc_seg: 93.8734, loss: 0.1498, grad_norm: 1.8421
2023-02-19 14:39:54,820 - mmseg - INFO - Iter [133500/160000]	lr: 9.938e-06, eta: 2:06:15, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0969, decode.acc_seg: 95.7516, aux.loss_ce: 0.0575, aux.acc_seg: 93.8476, loss: 0.1544, grad_norm: 1.3480
2023-02-19 14:40:09,027 - mmseg - INFO - Iter [133550/160000]	lr: 9.919e-06, eta: 2:06:01, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.7154, aux.loss_ce: 0.0555, aux.acc_seg: 93.9901, loss: 0.1526, grad_norm: 1.1343
2023-02-19 14:40:23,214 - mmseg - INFO - Iter [133600/160000]	lr: 9.900e-06, eta: 2:05:47, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1036, decode.acc_seg: 95.4248, aux.loss_ce: 0.0623, aux.acc_seg: 93.3326, loss: 0.1659, grad_norm: 1.5669
2023-02-19 14:40:36,922 - mmseg - INFO - Iter [133650/160000]	lr: 9.882e-06, eta: 2:05:32, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0924, decode.acc_seg: 95.8065, aux.loss_ce: 0.0533, aux.acc_seg: 94.0872, loss: 0.1457, grad_norm: 1.2904
2023-02-19 14:40:50,728 - mmseg - INFO - Iter [133700/160000]	lr: 9.863e-06, eta: 2:05:18, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.6882, aux.loss_ce: 0.0597, aux.acc_seg: 93.4425, loss: 0.1554, grad_norm: 1.7288
2023-02-19 14:41:05,547 - mmseg - INFO - Iter [133750/160000]	lr: 9.844e-06, eta: 2:05:04, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.7172, aux.loss_ce: 0.0590, aux.acc_seg: 93.6442, loss: 0.1554, grad_norm: 1.4402
2023-02-19 14:41:20,057 - mmseg - INFO - Iter [133800/160000]	lr: 9.825e-06, eta: 2:04:49, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.6686, aux.loss_ce: 0.0569, aux.acc_seg: 93.8197, loss: 0.1539, grad_norm: 1.1150
2023-02-19 14:41:33,665 - mmseg - INFO - Iter [133850/160000]	lr: 9.807e-06, eta: 2:04:35, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0983, decode.acc_seg: 95.6493, aux.loss_ce: 0.0579, aux.acc_seg: 93.6279, loss: 0.1562, grad_norm: 1.9808
2023-02-19 14:41:49,698 - mmseg - INFO - Iter [133900/160000]	lr: 9.788e-06, eta: 2:04:21, time: 0.321, data_time: 0.048, memory: 15214, decode.loss_ce: 0.1027, decode.acc_seg: 95.4437, aux.loss_ce: 0.0649, aux.acc_seg: 93.0703, loss: 0.1676, grad_norm: 1.4405
2023-02-19 14:42:03,633 - mmseg - INFO - Iter [133950/160000]	lr: 9.769e-06, eta: 2:04:07, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.5459, aux.loss_ce: 0.0606, aux.acc_seg: 93.5500, loss: 0.1619, grad_norm: 1.2521
2023-02-19 14:42:17,561 - mmseg - INFO - Saving checkpoint at 134000 iterations
2023-02-19 14:42:20,806 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:42:20,807 - mmseg - INFO - Iter [134000/160000]	lr: 9.750e-06, eta: 2:03:53, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.5299, aux.loss_ce: 0.0593, aux.acc_seg: 93.5196, loss: 0.1590, grad_norm: 1.2865
2023-02-19 14:42:35,063 - mmseg - INFO - Iter [134050/160000]	lr: 9.732e-06, eta: 2:03:39, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.5481, aux.loss_ce: 0.0571, aux.acc_seg: 93.6669, loss: 0.1542, grad_norm: 1.2291
2023-02-19 14:42:49,124 - mmseg - INFO - Iter [134100/160000]	lr: 9.713e-06, eta: 2:03:24, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1016, decode.acc_seg: 95.5248, aux.loss_ce: 0.0617, aux.acc_seg: 93.2906, loss: 0.1633, grad_norm: 1.9291
2023-02-19 14:43:03,316 - mmseg - INFO - Iter [134150/160000]	lr: 9.694e-06, eta: 2:03:10, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.9707, aux.loss_ce: 0.0546, aux.acc_seg: 94.0766, loss: 0.1453, grad_norm: 1.1685
2023-02-19 14:43:17,551 - mmseg - INFO - Iter [134200/160000]	lr: 9.675e-06, eta: 2:02:56, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0993, decode.acc_seg: 95.6153, aux.loss_ce: 0.0586, aux.acc_seg: 93.6158, loss: 0.1579, grad_norm: 1.1204
2023-02-19 14:43:31,448 - mmseg - INFO - Iter [134250/160000]	lr: 9.657e-06, eta: 2:02:41, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.7672, aux.loss_ce: 0.0585, aux.acc_seg: 93.7481, loss: 0.1547, grad_norm: 1.7449
2023-02-19 14:43:45,340 - mmseg - INFO - Iter [134300/160000]	lr: 9.638e-06, eta: 2:02:27, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0975, decode.acc_seg: 95.6541, aux.loss_ce: 0.0574, aux.acc_seg: 93.7240, loss: 0.1549, grad_norm: 1.1697
2023-02-19 14:43:59,047 - mmseg - INFO - Iter [134350/160000]	lr: 9.619e-06, eta: 2:02:12, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0936, decode.acc_seg: 95.7537, aux.loss_ce: 0.0576, aux.acc_seg: 93.6280, loss: 0.1512, grad_norm: 1.3389
2023-02-19 14:44:12,631 - mmseg - INFO - Iter [134400/160000]	lr: 9.600e-06, eta: 2:01:58, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0977, decode.acc_seg: 95.7016, aux.loss_ce: 0.0596, aux.acc_seg: 93.5221, loss: 0.1573, grad_norm: 1.6466
2023-02-19 14:44:26,611 - mmseg - INFO - Iter [134450/160000]	lr: 9.582e-06, eta: 2:01:44, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1011, decode.acc_seg: 95.5998, aux.loss_ce: 0.0604, aux.acc_seg: 93.5087, loss: 0.1616, grad_norm: 2.1020
2023-02-19 14:44:40,244 - mmseg - INFO - Iter [134500/160000]	lr: 9.563e-06, eta: 2:01:29, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.8017, aux.loss_ce: 0.0565, aux.acc_seg: 93.9027, loss: 0.1513, grad_norm: 2.1808
2023-02-19 14:44:54,153 - mmseg - INFO - Iter [134550/160000]	lr: 9.544e-06, eta: 2:01:15, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1014, decode.acc_seg: 95.5531, aux.loss_ce: 0.0595, aux.acc_seg: 93.6553, loss: 0.1609, grad_norm: 1.2137
2023-02-19 14:45:08,099 - mmseg - INFO - Iter [134600/160000]	lr: 9.525e-06, eta: 2:01:00, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.6343, aux.loss_ce: 0.0610, aux.acc_seg: 93.4809, loss: 0.1606, grad_norm: 1.2584
2023-02-19 14:45:22,351 - mmseg - INFO - Iter [134650/160000]	lr: 9.507e-06, eta: 2:00:46, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.6469, aux.loss_ce: 0.0573, aux.acc_seg: 93.7088, loss: 0.1527, grad_norm: 1.2637
2023-02-19 14:45:36,012 - mmseg - INFO - Iter [134700/160000]	lr: 9.488e-06, eta: 2:00:32, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8074, aux.loss_ce: 0.0573, aux.acc_seg: 93.7767, loss: 0.1506, grad_norm: 2.4989
2023-02-19 14:45:49,664 - mmseg - INFO - Iter [134750/160000]	lr: 9.469e-06, eta: 2:00:17, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.8188, aux.loss_ce: 0.0563, aux.acc_seg: 93.9733, loss: 0.1510, grad_norm: 1.1789
2023-02-19 14:46:03,434 - mmseg - INFO - Iter [134800/160000]	lr: 9.450e-06, eta: 2:00:03, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.7660, aux.loss_ce: 0.0573, aux.acc_seg: 93.7693, loss: 0.1528, grad_norm: 1.2800
2023-02-19 14:46:17,174 - mmseg - INFO - Iter [134850/160000]	lr: 9.432e-06, eta: 1:59:49, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.7491, aux.loss_ce: 0.0581, aux.acc_seg: 93.6527, loss: 0.1525, grad_norm: 1.2048
2023-02-19 14:46:31,264 - mmseg - INFO - Iter [134900/160000]	lr: 9.413e-06, eta: 1:59:34, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1027, decode.acc_seg: 95.5128, aux.loss_ce: 0.0620, aux.acc_seg: 93.3597, loss: 0.1646, grad_norm: 1.3715
2023-02-19 14:46:44,993 - mmseg - INFO - Iter [134950/160000]	lr: 9.394e-06, eta: 1:59:20, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0911, decode.acc_seg: 95.8462, aux.loss_ce: 0.0560, aux.acc_seg: 93.7884, loss: 0.1471, grad_norm: 1.2288
2023-02-19 14:46:58,807 - mmseg - INFO - Saving checkpoint at 135000 iterations
2023-02-19 14:47:02,121 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:47:02,121 - mmseg - INFO - Iter [135000/160000]	lr: 9.375e-06, eta: 1:59:06, time: 0.343, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0953, decode.acc_seg: 95.7234, aux.loss_ce: 0.0566, aux.acc_seg: 93.8715, loss: 0.1519, grad_norm: 1.1023
2023-02-19 14:47:15,992 - mmseg - INFO - Iter [135050/160000]	lr: 9.357e-06, eta: 1:58:52, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.8079, aux.loss_ce: 0.0562, aux.acc_seg: 93.9944, loss: 0.1519, grad_norm: 1.8837
2023-02-19 14:47:29,856 - mmseg - INFO - Iter [135100/160000]	lr: 9.338e-06, eta: 1:58:37, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.5145, aux.loss_ce: 0.0608, aux.acc_seg: 93.3763, loss: 0.1599, grad_norm: 1.6306
2023-02-19 14:47:46,945 - mmseg - INFO - Iter [135150/160000]	lr: 9.319e-06, eta: 1:58:24, time: 0.342, data_time: 0.049, memory: 15214, decode.loss_ce: 0.0975, decode.acc_seg: 95.7273, aux.loss_ce: 0.0574, aux.acc_seg: 93.9115, loss: 0.1549, grad_norm: 1.2009
2023-02-19 14:48:00,623 - mmseg - INFO - Iter [135200/160000]	lr: 9.300e-06, eta: 1:58:09, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.5972, aux.loss_ce: 0.0588, aux.acc_seg: 93.7352, loss: 0.1584, grad_norm: 1.0899
2023-02-19 14:48:15,112 - mmseg - INFO - Iter [135250/160000]	lr: 9.282e-06, eta: 1:57:55, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0945, decode.acc_seg: 95.6214, aux.loss_ce: 0.0560, aux.acc_seg: 93.7846, loss: 0.1506, grad_norm: 1.1318
2023-02-19 14:48:29,112 - mmseg - INFO - Iter [135300/160000]	lr: 9.263e-06, eta: 1:57:41, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1018, decode.acc_seg: 95.5930, aux.loss_ce: 0.0603, aux.acc_seg: 93.6011, loss: 0.1621, grad_norm: 1.5933
2023-02-19 14:48:42,870 - mmseg - INFO - Iter [135350/160000]	lr: 9.244e-06, eta: 1:57:26, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0975, decode.acc_seg: 95.5962, aux.loss_ce: 0.0563, aux.acc_seg: 93.8327, loss: 0.1538, grad_norm: 1.0839
2023-02-19 14:48:56,665 - mmseg - INFO - Iter [135400/160000]	lr: 9.225e-06, eta: 1:57:12, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0959, decode.acc_seg: 95.7549, aux.loss_ce: 0.0560, aux.acc_seg: 93.9593, loss: 0.1519, grad_norm: 1.9422
2023-02-19 14:49:10,448 - mmseg - INFO - Iter [135450/160000]	lr: 9.207e-06, eta: 1:56:57, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6598, aux.loss_ce: 0.0575, aux.acc_seg: 93.6799, loss: 0.1543, grad_norm: 1.1559
2023-02-19 14:49:24,135 - mmseg - INFO - Iter [135500/160000]	lr: 9.188e-06, eta: 1:56:43, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.7819, aux.loss_ce: 0.0584, aux.acc_seg: 93.7767, loss: 0.1530, grad_norm: 1.4736
2023-02-19 14:49:38,223 - mmseg - INFO - Iter [135550/160000]	lr: 9.169e-06, eta: 1:56:29, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0936, decode.acc_seg: 95.9160, aux.loss_ce: 0.0563, aux.acc_seg: 93.9696, loss: 0.1499, grad_norm: 1.3182
2023-02-19 14:49:52,014 - mmseg - INFO - Iter [135600/160000]	lr: 9.150e-06, eta: 1:56:14, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.6372, aux.loss_ce: 0.0578, aux.acc_seg: 93.6467, loss: 0.1547, grad_norm: 1.1283
2023-02-19 14:50:05,633 - mmseg - INFO - Iter [135650/160000]	lr: 9.132e-06, eta: 1:56:00, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.6462, aux.loss_ce: 0.0560, aux.acc_seg: 93.7313, loss: 0.1509, grad_norm: 1.1087
2023-02-19 14:50:19,963 - mmseg - INFO - Iter [135700/160000]	lr: 9.113e-06, eta: 1:55:46, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7390, aux.loss_ce: 0.0577, aux.acc_seg: 93.6592, loss: 0.1538, grad_norm: 1.3040
2023-02-19 14:50:33,889 - mmseg - INFO - Iter [135750/160000]	lr: 9.094e-06, eta: 1:55:31, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1006, decode.acc_seg: 95.4306, aux.loss_ce: 0.0589, aux.acc_seg: 93.4486, loss: 0.1595, grad_norm: 1.9482
2023-02-19 14:50:48,035 - mmseg - INFO - Iter [135800/160000]	lr: 9.075e-06, eta: 1:55:17, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1003, decode.acc_seg: 95.5923, aux.loss_ce: 0.0610, aux.acc_seg: 93.5108, loss: 0.1613, grad_norm: 1.7460
2023-02-19 14:51:03,157 - mmseg - INFO - Iter [135850/160000]	lr: 9.057e-06, eta: 1:55:03, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7803, aux.loss_ce: 0.0560, aux.acc_seg: 93.9157, loss: 0.1509, grad_norm: 1.4184
2023-02-19 14:51:16,939 - mmseg - INFO - Iter [135900/160000]	lr: 9.038e-06, eta: 1:54:48, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7928, aux.loss_ce: 0.0571, aux.acc_seg: 93.7706, loss: 0.1523, grad_norm: 1.3873
2023-02-19 14:51:31,166 - mmseg - INFO - Iter [135950/160000]	lr: 9.019e-06, eta: 1:54:34, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.7676, aux.loss_ce: 0.0565, aux.acc_seg: 93.8851, loss: 0.1530, grad_norm: 1.0634
2023-02-19 14:51:45,115 - mmseg - INFO - Saving checkpoint at 136000 iterations
2023-02-19 14:51:48,343 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:51:48,343 - mmseg - INFO - Iter [136000/160000]	lr: 9.000e-06, eta: 1:54:20, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1002, decode.acc_seg: 95.5142, aux.loss_ce: 0.0592, aux.acc_seg: 93.5401, loss: 0.1594, grad_norm: 1.2195
2023-02-19 14:52:02,504 - mmseg - INFO - Iter [136050/160000]	lr: 8.982e-06, eta: 1:54:06, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0935, decode.acc_seg: 95.8119, aux.loss_ce: 0.0569, aux.acc_seg: 93.8374, loss: 0.1504, grad_norm: 1.4170
2023-02-19 14:52:16,509 - mmseg - INFO - Iter [136100/160000]	lr: 8.963e-06, eta: 1:53:52, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.6368, aux.loss_ce: 0.0586, aux.acc_seg: 93.5973, loss: 0.1577, grad_norm: 1.2404
2023-02-19 14:52:31,103 - mmseg - INFO - Iter [136150/160000]	lr: 8.944e-06, eta: 1:53:37, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8701, aux.loss_ce: 0.0566, aux.acc_seg: 93.8512, loss: 0.1498, grad_norm: 1.1863
2023-02-19 14:52:45,029 - mmseg - INFO - Iter [136200/160000]	lr: 8.925e-06, eta: 1:53:23, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8268, aux.loss_ce: 0.0555, aux.acc_seg: 93.8950, loss: 0.1473, grad_norm: 1.3591
2023-02-19 14:52:58,776 - mmseg - INFO - Iter [136250/160000]	lr: 8.907e-06, eta: 1:53:09, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0986, decode.acc_seg: 95.6221, aux.loss_ce: 0.0607, aux.acc_seg: 93.3482, loss: 0.1592, grad_norm: 1.7450
2023-02-19 14:53:12,578 - mmseg - INFO - Iter [136300/160000]	lr: 8.888e-06, eta: 1:52:54, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.8037, aux.loss_ce: 0.0556, aux.acc_seg: 93.8931, loss: 0.1476, grad_norm: 1.2645
2023-02-19 14:53:26,195 - mmseg - INFO - Iter [136350/160000]	lr: 8.869e-06, eta: 1:52:40, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6485, aux.loss_ce: 0.0579, aux.acc_seg: 93.6687, loss: 0.1546, grad_norm: 1.3888
2023-02-19 14:53:40,468 - mmseg - INFO - Iter [136400/160000]	lr: 8.850e-06, eta: 1:52:26, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.8232, aux.loss_ce: 0.0593, aux.acc_seg: 93.7229, loss: 0.1549, grad_norm: 1.4221
2023-02-19 14:53:57,551 - mmseg - INFO - Iter [136450/160000]	lr: 8.832e-06, eta: 1:52:12, time: 0.342, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0942, decode.acc_seg: 95.7486, aux.loss_ce: 0.0567, aux.acc_seg: 93.7641, loss: 0.1510, grad_norm: 1.5197
2023-02-19 14:54:12,117 - mmseg - INFO - Iter [136500/160000]	lr: 8.813e-06, eta: 1:51:57, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0911, decode.acc_seg: 95.9937, aux.loss_ce: 0.0551, aux.acc_seg: 94.0844, loss: 0.1462, grad_norm: 1.1693
2023-02-19 14:54:26,124 - mmseg - INFO - Iter [136550/160000]	lr: 8.794e-06, eta: 1:51:43, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0940, decode.acc_seg: 95.8078, aux.loss_ce: 0.0586, aux.acc_seg: 93.6392, loss: 0.1526, grad_norm: 1.6009
2023-02-19 14:54:40,024 - mmseg - INFO - Iter [136600/160000]	lr: 8.775e-06, eta: 1:51:29, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.7955, aux.loss_ce: 0.0557, aux.acc_seg: 93.9705, loss: 0.1504, grad_norm: 1.1710
2023-02-19 14:54:54,193 - mmseg - INFO - Iter [136650/160000]	lr: 8.757e-06, eta: 1:51:14, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0983, decode.acc_seg: 95.7464, aux.loss_ce: 0.0596, aux.acc_seg: 93.6701, loss: 0.1579, grad_norm: 1.7282
2023-02-19 14:55:07,920 - mmseg - INFO - Iter [136700/160000]	lr: 8.738e-06, eta: 1:51:00, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.6402, aux.loss_ce: 0.0601, aux.acc_seg: 93.5987, loss: 0.1593, grad_norm: 1.3207
2023-02-19 14:55:22,461 - mmseg - INFO - Iter [136750/160000]	lr: 8.719e-06, eta: 1:50:46, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.7789, aux.loss_ce: 0.0566, aux.acc_seg: 93.8960, loss: 0.1522, grad_norm: 1.1852
2023-02-19 14:55:36,190 - mmseg - INFO - Iter [136800/160000]	lr: 8.700e-06, eta: 1:50:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0971, decode.acc_seg: 95.6662, aux.loss_ce: 0.0565, aux.acc_seg: 93.7859, loss: 0.1536, grad_norm: 1.3818
2023-02-19 14:55:50,662 - mmseg - INFO - Iter [136850/160000]	lr: 8.682e-06, eta: 1:50:17, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.7014, aux.loss_ce: 0.0572, aux.acc_seg: 93.7138, loss: 0.1537, grad_norm: 1.6912
2023-02-19 14:56:04,631 - mmseg - INFO - Iter [136900/160000]	lr: 8.663e-06, eta: 1:50:03, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1074, decode.acc_seg: 95.4649, aux.loss_ce: 0.0615, aux.acc_seg: 93.5593, loss: 0.1689, grad_norm: 1.5892
2023-02-19 14:56:19,009 - mmseg - INFO - Iter [136950/160000]	lr: 8.644e-06, eta: 1:49:49, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0972, decode.acc_seg: 95.6180, aux.loss_ce: 0.0592, aux.acc_seg: 93.5394, loss: 0.1565, grad_norm: 1.4464
2023-02-19 14:56:32,730 - mmseg - INFO - Saving checkpoint at 137000 iterations
2023-02-19 14:56:35,951 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 14:56:35,952 - mmseg - INFO - Iter [137000/160000]	lr: 8.625e-06, eta: 1:49:35, time: 0.339, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0953, decode.acc_seg: 95.7981, aux.loss_ce: 0.0587, aux.acc_seg: 93.7058, loss: 0.1540, grad_norm: 1.2192
2023-02-19 14:56:49,731 - mmseg - INFO - Iter [137050/160000]	lr: 8.607e-06, eta: 1:49:20, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0979, decode.acc_seg: 95.7393, aux.loss_ce: 0.0606, aux.acc_seg: 93.5248, loss: 0.1585, grad_norm: 1.2176
2023-02-19 14:57:04,097 - mmseg - INFO - Iter [137100/160000]	lr: 8.588e-06, eta: 1:49:06, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.7835, aux.loss_ce: 0.0587, aux.acc_seg: 93.6793, loss: 0.1542, grad_norm: 1.3122
2023-02-19 14:57:18,669 - mmseg - INFO - Iter [137150/160000]	lr: 8.569e-06, eta: 1:48:52, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0945, decode.acc_seg: 95.8538, aux.loss_ce: 0.0567, aux.acc_seg: 93.9310, loss: 0.1512, grad_norm: 1.3967
2023-02-19 14:57:32,948 - mmseg - INFO - Iter [137200/160000]	lr: 8.550e-06, eta: 1:48:38, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.8014, aux.loss_ce: 0.0558, aux.acc_seg: 93.9174, loss: 0.1492, grad_norm: 1.8321
2023-02-19 14:57:46,815 - mmseg - INFO - Iter [137250/160000]	lr: 8.532e-06, eta: 1:48:23, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.5853, aux.loss_ce: 0.0575, aux.acc_seg: 93.5833, loss: 0.1548, grad_norm: 1.5638
2023-02-19 14:58:00,704 - mmseg - INFO - Iter [137300/160000]	lr: 8.513e-06, eta: 1:48:09, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1032, decode.acc_seg: 95.4511, aux.loss_ce: 0.0612, aux.acc_seg: 93.4401, loss: 0.1644, grad_norm: 1.6067
2023-02-19 14:58:14,682 - mmseg - INFO - Iter [137350/160000]	lr: 8.494e-06, eta: 1:47:54, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.5184, aux.loss_ce: 0.0585, aux.acc_seg: 93.5771, loss: 0.1572, grad_norm: 1.6399
2023-02-19 14:58:28,715 - mmseg - INFO - Iter [137400/160000]	lr: 8.475e-06, eta: 1:47:40, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.6349, aux.loss_ce: 0.0588, aux.acc_seg: 93.6991, loss: 0.1576, grad_norm: 1.4473
2023-02-19 14:58:42,887 - mmseg - INFO - Iter [137450/160000]	lr: 8.457e-06, eta: 1:47:26, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.8594, aux.loss_ce: 0.0573, aux.acc_seg: 93.9504, loss: 0.1528, grad_norm: 1.2236
2023-02-19 14:58:56,535 - mmseg - INFO - Iter [137500/160000]	lr: 8.438e-06, eta: 1:47:11, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.6991, aux.loss_ce: 0.0571, aux.acc_seg: 93.7269, loss: 0.1528, grad_norm: 1.5854
2023-02-19 14:59:10,419 - mmseg - INFO - Iter [137550/160000]	lr: 8.419e-06, eta: 1:46:57, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0940, decode.acc_seg: 95.7563, aux.loss_ce: 0.0553, aux.acc_seg: 94.0046, loss: 0.1493, grad_norm: 1.1570
2023-02-19 14:59:25,126 - mmseg - INFO - Iter [137600/160000]	lr: 8.400e-06, eta: 1:46:43, time: 0.294, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7270, aux.loss_ce: 0.0569, aux.acc_seg: 93.9009, loss: 0.1529, grad_norm: 1.1958
2023-02-19 14:59:39,060 - mmseg - INFO - Iter [137650/160000]	lr: 8.382e-06, eta: 1:46:28, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.8472, aux.loss_ce: 0.0578, aux.acc_seg: 93.7933, loss: 0.1519, grad_norm: 1.7726
2023-02-19 14:59:54,875 - mmseg - INFO - Iter [137700/160000]	lr: 8.363e-06, eta: 1:46:14, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0910, decode.acc_seg: 95.8172, aux.loss_ce: 0.0550, aux.acc_seg: 93.9549, loss: 0.1460, grad_norm: 1.3061
2023-02-19 15:00:09,366 - mmseg - INFO - Iter [137750/160000]	lr: 8.344e-06, eta: 1:46:00, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.8157, aux.loss_ce: 0.0535, aux.acc_seg: 94.0808, loss: 0.1444, grad_norm: 1.1893
2023-02-19 15:00:23,812 - mmseg - INFO - Iter [137800/160000]	lr: 8.325e-06, eta: 1:45:46, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.8909, aux.loss_ce: 0.0547, aux.acc_seg: 93.9820, loss: 0.1455, grad_norm: 1.5934
2023-02-19 15:00:38,365 - mmseg - INFO - Iter [137850/160000]	lr: 8.307e-06, eta: 1:45:32, time: 0.291, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0961, decode.acc_seg: 95.7257, aux.loss_ce: 0.0570, aux.acc_seg: 93.8448, loss: 0.1531, grad_norm: 1.2156
2023-02-19 15:00:52,411 - mmseg - INFO - Iter [137900/160000]	lr: 8.288e-06, eta: 1:45:17, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.9037, aux.loss_ce: 0.0548, aux.acc_seg: 93.9577, loss: 0.1456, grad_norm: 1.2067
2023-02-19 15:01:06,258 - mmseg - INFO - Iter [137950/160000]	lr: 8.269e-06, eta: 1:45:03, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0927, decode.acc_seg: 95.7796, aux.loss_ce: 0.0549, aux.acc_seg: 93.9571, loss: 0.1476, grad_norm: 1.3207
2023-02-19 15:01:20,327 - mmseg - INFO - Saving checkpoint at 138000 iterations
2023-02-19 15:01:23,643 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:01:23,643 - mmseg - INFO - Iter [138000/160000]	lr: 8.250e-06, eta: 1:44:49, time: 0.348, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.7044, aux.loss_ce: 0.0570, aux.acc_seg: 93.7844, loss: 0.1534, grad_norm: 1.3142
2023-02-19 15:01:38,179 - mmseg - INFO - Iter [138050/160000]	lr: 8.232e-06, eta: 1:44:35, time: 0.290, data_time: 0.004, memory: 15214, decode.loss_ce: 0.1026, decode.acc_seg: 95.4345, aux.loss_ce: 0.0579, aux.acc_seg: 93.6743, loss: 0.1605, grad_norm: 1.9677
2023-02-19 15:01:51,747 - mmseg - INFO - Iter [138100/160000]	lr: 8.213e-06, eta: 1:44:20, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1003, decode.acc_seg: 95.5169, aux.loss_ce: 0.0586, aux.acc_seg: 93.7005, loss: 0.1588, grad_norm: 1.6368
2023-02-19 15:02:06,285 - mmseg - INFO - Iter [138150/160000]	lr: 8.194e-06, eta: 1:44:06, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0984, decode.acc_seg: 95.7142, aux.loss_ce: 0.0573, aux.acc_seg: 93.8222, loss: 0.1557, grad_norm: 1.3031
2023-02-19 15:02:19,851 - mmseg - INFO - Iter [138200/160000]	lr: 8.175e-06, eta: 1:43:52, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0961, decode.acc_seg: 95.6619, aux.loss_ce: 0.0564, aux.acc_seg: 93.8774, loss: 0.1525, grad_norm: 1.2824
2023-02-19 15:02:33,439 - mmseg - INFO - Iter [138250/160000]	lr: 8.157e-06, eta: 1:43:37, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1019, decode.acc_seg: 95.5042, aux.loss_ce: 0.0596, aux.acc_seg: 93.6129, loss: 0.1615, grad_norm: 1.3603
2023-02-19 15:02:47,537 - mmseg - INFO - Iter [138300/160000]	lr: 8.138e-06, eta: 1:43:23, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1011, decode.acc_seg: 95.5145, aux.loss_ce: 0.0586, aux.acc_seg: 93.7531, loss: 0.1597, grad_norm: 2.0557
2023-02-19 15:03:01,716 - mmseg - INFO - Iter [138350/160000]	lr: 8.119e-06, eta: 1:43:09, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0963, decode.acc_seg: 95.6329, aux.loss_ce: 0.0567, aux.acc_seg: 93.7761, loss: 0.1530, grad_norm: 1.7408
2023-02-19 15:03:15,337 - mmseg - INFO - Iter [138400/160000]	lr: 8.100e-06, eta: 1:42:54, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.7320, aux.loss_ce: 0.0596, aux.acc_seg: 93.6211, loss: 0.1569, grad_norm: 1.4337
2023-02-19 15:03:29,250 - mmseg - INFO - Iter [138450/160000]	lr: 8.082e-06, eta: 1:42:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.9354, aux.loss_ce: 0.0569, aux.acc_seg: 94.0313, loss: 0.1513, grad_norm: 1.5609
2023-02-19 15:03:42,785 - mmseg - INFO - Iter [138500/160000]	lr: 8.063e-06, eta: 1:42:26, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0971, decode.acc_seg: 95.7040, aux.loss_ce: 0.0587, aux.acc_seg: 93.7084, loss: 0.1558, grad_norm: 1.7895
2023-02-19 15:03:56,855 - mmseg - INFO - Iter [138550/160000]	lr: 8.044e-06, eta: 1:42:11, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7156, aux.loss_ce: 0.0573, aux.acc_seg: 93.7761, loss: 0.1533, grad_norm: 1.1425
2023-02-19 15:04:10,846 - mmseg - INFO - Iter [138600/160000]	lr: 8.025e-06, eta: 1:41:57, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0980, decode.acc_seg: 95.7003, aux.loss_ce: 0.0600, aux.acc_seg: 93.5687, loss: 0.1580, grad_norm: 1.2364
2023-02-19 15:04:24,896 - mmseg - INFO - Iter [138650/160000]	lr: 8.007e-06, eta: 1:41:43, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0959, decode.acc_seg: 95.7318, aux.loss_ce: 0.0564, aux.acc_seg: 93.9074, loss: 0.1524, grad_norm: 1.2254
2023-02-19 15:04:38,925 - mmseg - INFO - Iter [138700/160000]	lr: 7.988e-06, eta: 1:41:28, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7363, aux.loss_ce: 0.0562, aux.acc_seg: 93.8801, loss: 0.1511, grad_norm: 1.1465
2023-02-19 15:04:52,530 - mmseg - INFO - Iter [138750/160000]	lr: 7.969e-06, eta: 1:41:14, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0890, decode.acc_seg: 95.9150, aux.loss_ce: 0.0548, aux.acc_seg: 93.9081, loss: 0.1438, grad_norm: 1.2949
2023-02-19 15:05:06,417 - mmseg - INFO - Iter [138800/160000]	lr: 7.950e-06, eta: 1:40:59, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.8030, aux.loss_ce: 0.0568, aux.acc_seg: 93.9179, loss: 0.1514, grad_norm: 1.2037
2023-02-19 15:05:20,476 - mmseg - INFO - Iter [138850/160000]	lr: 7.932e-06, eta: 1:40:45, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.7234, aux.loss_ce: 0.0566, aux.acc_seg: 93.8698, loss: 0.1513, grad_norm: 1.2878
2023-02-19 15:05:34,591 - mmseg - INFO - Iter [138900/160000]	lr: 7.913e-06, eta: 1:40:31, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7186, aux.loss_ce: 0.0545, aux.acc_seg: 94.0338, loss: 0.1497, grad_norm: 1.0648
2023-02-19 15:05:50,399 - mmseg - INFO - Iter [138950/160000]	lr: 7.894e-06, eta: 1:40:17, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0976, decode.acc_seg: 95.6810, aux.loss_ce: 0.0584, aux.acc_seg: 93.7075, loss: 0.1561, grad_norm: 1.4629
2023-02-19 15:06:04,150 - mmseg - INFO - Saving checkpoint at 139000 iterations
2023-02-19 15:06:07,400 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:06:07,400 - mmseg - INFO - Iter [139000/160000]	lr: 7.875e-06, eta: 1:40:03, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8393, aux.loss_ce: 0.0550, aux.acc_seg: 94.0348, loss: 0.1479, grad_norm: 1.1549
2023-02-19 15:06:21,106 - mmseg - INFO - Iter [139050/160000]	lr: 7.857e-06, eta: 1:39:49, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.8610, aux.loss_ce: 0.0555, aux.acc_seg: 93.9495, loss: 0.1477, grad_norm: 1.1676
2023-02-19 15:06:34,901 - mmseg - INFO - Iter [139100/160000]	lr: 7.838e-06, eta: 1:39:34, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0927, decode.acc_seg: 95.8986, aux.loss_ce: 0.0566, aux.acc_seg: 93.8497, loss: 0.1493, grad_norm: 1.1196
2023-02-19 15:06:48,403 - mmseg - INFO - Iter [139150/160000]	lr: 7.819e-06, eta: 1:39:20, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.6183, aux.loss_ce: 0.0583, aux.acc_seg: 93.6694, loss: 0.1575, grad_norm: 1.3255
2023-02-19 15:07:02,011 - mmseg - INFO - Iter [139200/160000]	lr: 7.800e-06, eta: 1:39:05, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.7542, aux.loss_ce: 0.0572, aux.acc_seg: 93.7899, loss: 0.1529, grad_norm: 1.3023
2023-02-19 15:07:16,122 - mmseg - INFO - Iter [139250/160000]	lr: 7.782e-06, eta: 1:38:51, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1008, decode.acc_seg: 95.4818, aux.loss_ce: 0.0599, aux.acc_seg: 93.6066, loss: 0.1606, grad_norm: 1.6587
2023-02-19 15:07:29,705 - mmseg - INFO - Iter [139300/160000]	lr: 7.763e-06, eta: 1:38:37, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.8298, aux.loss_ce: 0.0538, aux.acc_seg: 94.1195, loss: 0.1462, grad_norm: 1.2530
2023-02-19 15:07:43,413 - mmseg - INFO - Iter [139350/160000]	lr: 7.744e-06, eta: 1:38:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.8640, aux.loss_ce: 0.0554, aux.acc_seg: 94.0139, loss: 0.1485, grad_norm: 1.1080
2023-02-19 15:07:57,070 - mmseg - INFO - Iter [139400/160000]	lr: 7.725e-06, eta: 1:38:08, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0979, decode.acc_seg: 95.6336, aux.loss_ce: 0.0588, aux.acc_seg: 93.6435, loss: 0.1567, grad_norm: 1.2510
2023-02-19 15:08:11,159 - mmseg - INFO - Iter [139450/160000]	lr: 7.707e-06, eta: 1:37:54, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.8770, aux.loss_ce: 0.0561, aux.acc_seg: 93.9437, loss: 0.1502, grad_norm: 1.4175
2023-02-19 15:08:24,904 - mmseg - INFO - Iter [139500/160000]	lr: 7.688e-06, eta: 1:37:39, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0959, decode.acc_seg: 95.7595, aux.loss_ce: 0.0551, aux.acc_seg: 94.0020, loss: 0.1510, grad_norm: 1.3027
2023-02-19 15:08:38,519 - mmseg - INFO - Iter [139550/160000]	lr: 7.669e-06, eta: 1:37:25, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7958, aux.loss_ce: 0.0561, aux.acc_seg: 93.9286, loss: 0.1510, grad_norm: 1.1370
2023-02-19 15:08:52,081 - mmseg - INFO - Iter [139600/160000]	lr: 7.650e-06, eta: 1:37:10, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0956, decode.acc_seg: 95.7042, aux.loss_ce: 0.0561, aux.acc_seg: 93.8785, loss: 0.1516, grad_norm: 1.4205
2023-02-19 15:09:05,887 - mmseg - INFO - Iter [139650/160000]	lr: 7.632e-06, eta: 1:36:56, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7837, aux.loss_ce: 0.0573, aux.acc_seg: 93.8243, loss: 0.1523, grad_norm: 1.2141
2023-02-19 15:09:19,866 - mmseg - INFO - Iter [139700/160000]	lr: 7.613e-06, eta: 1:36:42, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1036, decode.acc_seg: 95.4388, aux.loss_ce: 0.0615, aux.acc_seg: 93.3377, loss: 0.1651, grad_norm: 1.7489
2023-02-19 15:09:34,218 - mmseg - INFO - Iter [139750/160000]	lr: 7.594e-06, eta: 1:36:27, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7918, aux.loss_ce: 0.0580, aux.acc_seg: 93.7708, loss: 0.1532, grad_norm: 1.2678
2023-02-19 15:09:47,979 - mmseg - INFO - Iter [139800/160000]	lr: 7.575e-06, eta: 1:36:13, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1025, decode.acc_seg: 95.4753, aux.loss_ce: 0.0611, aux.acc_seg: 93.4513, loss: 0.1636, grad_norm: 1.3863
2023-02-19 15:10:01,859 - mmseg - INFO - Iter [139850/160000]	lr: 7.557e-06, eta: 1:35:59, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0928, decode.acc_seg: 95.8618, aux.loss_ce: 0.0566, aux.acc_seg: 93.8874, loss: 0.1494, grad_norm: 1.4840
2023-02-19 15:10:15,475 - mmseg - INFO - Iter [139900/160000]	lr: 7.538e-06, eta: 1:35:44, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.5907, aux.loss_ce: 0.0575, aux.acc_seg: 93.6440, loss: 0.1539, grad_norm: 1.2849
2023-02-19 15:10:29,437 - mmseg - INFO - Iter [139950/160000]	lr: 7.519e-06, eta: 1:35:30, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.9810, aux.loss_ce: 0.0540, aux.acc_seg: 94.2229, loss: 0.1447, grad_norm: 1.0922
2023-02-19 15:10:43,097 - mmseg - INFO - Saving checkpoint at 140000 iterations
2023-02-19 15:10:46,411 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:10:46,411 - mmseg - INFO - Iter [140000/160000]	lr: 7.500e-06, eta: 1:35:16, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.7544, aux.loss_ce: 0.0573, aux.acc_seg: 93.7067, loss: 0.1520, grad_norm: 1.2173
2023-02-19 15:11:00,206 - mmseg - INFO - Iter [140050/160000]	lr: 7.482e-06, eta: 1:35:02, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0977, decode.acc_seg: 95.6221, aux.loss_ce: 0.0571, aux.acc_seg: 93.7539, loss: 0.1549, grad_norm: 1.4255
2023-02-19 15:11:14,361 - mmseg - INFO - Iter [140100/160000]	lr: 7.463e-06, eta: 1:34:47, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5764, aux.loss_ce: 0.0572, aux.acc_seg: 93.8336, loss: 0.1561, grad_norm: 1.3711
2023-02-19 15:11:28,647 - mmseg - INFO - Iter [140150/160000]	lr: 7.444e-06, eta: 1:34:33, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.8506, aux.loss_ce: 0.0546, aux.acc_seg: 94.0124, loss: 0.1468, grad_norm: 1.1746
2023-02-19 15:11:44,898 - mmseg - INFO - Iter [140200/160000]	lr: 7.425e-06, eta: 1:34:19, time: 0.325, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0989, decode.acc_seg: 95.6266, aux.loss_ce: 0.0582, aux.acc_seg: 93.7352, loss: 0.1571, grad_norm: 1.3290
2023-02-19 15:11:58,995 - mmseg - INFO - Iter [140250/160000]	lr: 7.407e-06, eta: 1:34:05, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 95.8827, aux.loss_ce: 0.0528, aux.acc_seg: 94.1214, loss: 0.1430, grad_norm: 1.1791
2023-02-19 15:12:12,786 - mmseg - INFO - Iter [140300/160000]	lr: 7.388e-06, eta: 1:33:50, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.7949, aux.loss_ce: 0.0566, aux.acc_seg: 93.8704, loss: 0.1514, grad_norm: 1.0794
2023-02-19 15:12:26,687 - mmseg - INFO - Iter [140350/160000]	lr: 7.369e-06, eta: 1:33:36, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.7378, aux.loss_ce: 0.0582, aux.acc_seg: 93.7056, loss: 0.1544, grad_norm: 1.2730
2023-02-19 15:12:40,468 - mmseg - INFO - Iter [140400/160000]	lr: 7.350e-06, eta: 1:33:22, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1007, decode.acc_seg: 95.4481, aux.loss_ce: 0.0588, aux.acc_seg: 93.6017, loss: 0.1595, grad_norm: 1.3200
2023-02-19 15:12:54,173 - mmseg - INFO - Iter [140450/160000]	lr: 7.332e-06, eta: 1:33:07, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.5607, aux.loss_ce: 0.0601, aux.acc_seg: 93.5344, loss: 0.1597, grad_norm: 1.6007
2023-02-19 15:13:08,035 - mmseg - INFO - Iter [140500/160000]	lr: 7.313e-06, eta: 1:32:53, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.7529, aux.loss_ce: 0.0576, aux.acc_seg: 93.8297, loss: 0.1539, grad_norm: 1.1693
2023-02-19 15:13:22,751 - mmseg - INFO - Iter [140550/160000]	lr: 7.294e-06, eta: 1:32:39, time: 0.294, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0938, decode.acc_seg: 95.8050, aux.loss_ce: 0.0571, aux.acc_seg: 93.8000, loss: 0.1510, grad_norm: 1.4304
2023-02-19 15:13:36,329 - mmseg - INFO - Iter [140600/160000]	lr: 7.275e-06, eta: 1:32:24, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8124, aux.loss_ce: 0.0555, aux.acc_seg: 93.9756, loss: 0.1488, grad_norm: 1.1241
2023-02-19 15:13:50,458 - mmseg - INFO - Iter [140650/160000]	lr: 7.257e-06, eta: 1:32:10, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.8225, aux.loss_ce: 0.0569, aux.acc_seg: 93.7799, loss: 0.1503, grad_norm: 1.1729
2023-02-19 15:14:04,381 - mmseg - INFO - Iter [140700/160000]	lr: 7.238e-06, eta: 1:31:56, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8403, aux.loss_ce: 0.0574, aux.acc_seg: 93.8734, loss: 0.1508, grad_norm: 1.3815
2023-02-19 15:14:18,613 - mmseg - INFO - Iter [140750/160000]	lr: 7.219e-06, eta: 1:31:41, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0911, decode.acc_seg: 95.8128, aux.loss_ce: 0.0545, aux.acc_seg: 93.9206, loss: 0.1457, grad_norm: 1.2880
2023-02-19 15:14:32,742 - mmseg - INFO - Iter [140800/160000]	lr: 7.200e-06, eta: 1:31:27, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.7241, aux.loss_ce: 0.0577, aux.acc_seg: 93.7405, loss: 0.1542, grad_norm: 1.5616
2023-02-19 15:14:47,902 - mmseg - INFO - Iter [140850/160000]	lr: 7.182e-06, eta: 1:31:13, time: 0.303, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1003, decode.acc_seg: 95.6319, aux.loss_ce: 0.0610, aux.acc_seg: 93.4730, loss: 0.1613, grad_norm: 1.4886
2023-02-19 15:15:01,958 - mmseg - INFO - Iter [140900/160000]	lr: 7.163e-06, eta: 1:30:59, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6962, aux.loss_ce: 0.0586, aux.acc_seg: 93.6982, loss: 0.1552, grad_norm: 1.3177
2023-02-19 15:15:15,803 - mmseg - INFO - Iter [140950/160000]	lr: 7.144e-06, eta: 1:30:44, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0891, decode.acc_seg: 95.8509, aux.loss_ce: 0.0547, aux.acc_seg: 93.8685, loss: 0.1438, grad_norm: 1.4557
2023-02-19 15:15:29,510 - mmseg - INFO - Saving checkpoint at 141000 iterations
2023-02-19 15:15:32,798 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:15:32,798 - mmseg - INFO - Iter [141000/160000]	lr: 7.125e-06, eta: 1:30:30, time: 0.340, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.7807, aux.loss_ce: 0.0552, aux.acc_seg: 93.9771, loss: 0.1482, grad_norm: 1.1747
2023-02-19 15:15:47,031 - mmseg - INFO - Iter [141050/160000]	lr: 7.107e-06, eta: 1:30:16, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.7836, aux.loss_ce: 0.0569, aux.acc_seg: 93.7883, loss: 0.1513, grad_norm: 1.3284
2023-02-19 15:16:01,024 - mmseg - INFO - Iter [141100/160000]	lr: 7.088e-06, eta: 1:30:02, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.6781, aux.loss_ce: 0.0547, aux.acc_seg: 93.9332, loss: 0.1501, grad_norm: 1.2189
2023-02-19 15:16:14,652 - mmseg - INFO - Iter [141150/160000]	lr: 7.069e-06, eta: 1:29:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0994, decode.acc_seg: 95.6830, aux.loss_ce: 0.0601, aux.acc_seg: 93.5668, loss: 0.1595, grad_norm: 1.3146
2023-02-19 15:16:28,427 - mmseg - INFO - Iter [141200/160000]	lr: 7.050e-06, eta: 1:29:33, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.7456, aux.loss_ce: 0.0586, aux.acc_seg: 93.7339, loss: 0.1560, grad_norm: 1.3536
2023-02-19 15:16:42,532 - mmseg - INFO - Iter [141250/160000]	lr: 7.032e-06, eta: 1:29:19, time: 0.283, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0903, decode.acc_seg: 95.9423, aux.loss_ce: 0.0543, aux.acc_seg: 94.0213, loss: 0.1446, grad_norm: 1.4370
2023-02-19 15:16:56,824 - mmseg - INFO - Iter [141300/160000]	lr: 7.013e-06, eta: 1:29:04, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0906, decode.acc_seg: 95.9196, aux.loss_ce: 0.0541, aux.acc_seg: 94.0883, loss: 0.1447, grad_norm: 1.4699
2023-02-19 15:17:10,927 - mmseg - INFO - Iter [141350/160000]	lr: 6.994e-06, eta: 1:28:50, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0880, decode.acc_seg: 95.9098, aux.loss_ce: 0.0522, aux.acc_seg: 94.1429, loss: 0.1402, grad_norm: 1.0309
2023-02-19 15:17:24,568 - mmseg - INFO - Iter [141400/160000]	lr: 6.975e-06, eta: 1:28:36, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0976, decode.acc_seg: 95.6709, aux.loss_ce: 0.0574, aux.acc_seg: 93.8069, loss: 0.1550, grad_norm: 1.2270
2023-02-19 15:17:38,658 - mmseg - INFO - Iter [141450/160000]	lr: 6.957e-06, eta: 1:28:21, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0923, decode.acc_seg: 95.8481, aux.loss_ce: 0.0565, aux.acc_seg: 93.8448, loss: 0.1488, grad_norm: 1.1701
2023-02-19 15:17:54,785 - mmseg - INFO - Iter [141500/160000]	lr: 6.938e-06, eta: 1:28:07, time: 0.322, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0980, decode.acc_seg: 95.6070, aux.loss_ce: 0.0574, aux.acc_seg: 93.7956, loss: 0.1554, grad_norm: 1.4030
2023-02-19 15:18:08,742 - mmseg - INFO - Iter [141550/160000]	lr: 6.919e-06, eta: 1:27:53, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0996, decode.acc_seg: 95.4984, aux.loss_ce: 0.0598, aux.acc_seg: 93.4614, loss: 0.1594, grad_norm: 1.4019
2023-02-19 15:18:23,000 - mmseg - INFO - Iter [141600/160000]	lr: 6.900e-06, eta: 1:27:39, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7931, aux.loss_ce: 0.0556, aux.acc_seg: 93.8723, loss: 0.1497, grad_norm: 1.5541
2023-02-19 15:18:36,906 - mmseg - INFO - Iter [141650/160000]	lr: 6.882e-06, eta: 1:27:24, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8854, aux.loss_ce: 0.0569, aux.acc_seg: 93.8047, loss: 0.1501, grad_norm: 1.2035
2023-02-19 15:18:51,598 - mmseg - INFO - Iter [141700/160000]	lr: 6.863e-06, eta: 1:27:10, time: 0.293, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.7835, aux.loss_ce: 0.0539, aux.acc_seg: 94.1866, loss: 0.1476, grad_norm: 1.1933
2023-02-19 15:19:05,463 - mmseg - INFO - Iter [141750/160000]	lr: 6.844e-06, eta: 1:26:56, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1045, decode.acc_seg: 95.4631, aux.loss_ce: 0.0590, aux.acc_seg: 93.5694, loss: 0.1635, grad_norm: 1.8872
2023-02-19 15:19:19,154 - mmseg - INFO - Iter [141800/160000]	lr: 6.825e-06, eta: 1:26:41, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0943, decode.acc_seg: 95.8919, aux.loss_ce: 0.0575, aux.acc_seg: 93.8368, loss: 0.1518, grad_norm: 1.6471
2023-02-19 15:19:32,751 - mmseg - INFO - Iter [141850/160000]	lr: 6.807e-06, eta: 1:26:27, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.8368, aux.loss_ce: 0.0559, aux.acc_seg: 94.0079, loss: 0.1496, grad_norm: 1.1071
2023-02-19 15:19:46,397 - mmseg - INFO - Iter [141900/160000]	lr: 6.788e-06, eta: 1:26:13, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0924, decode.acc_seg: 95.8262, aux.loss_ce: 0.0547, aux.acc_seg: 94.0290, loss: 0.1471, grad_norm: 1.3379
2023-02-19 15:20:00,311 - mmseg - INFO - Iter [141950/160000]	lr: 6.769e-06, eta: 1:25:58, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0891, decode.acc_seg: 95.9745, aux.loss_ce: 0.0550, aux.acc_seg: 93.9753, loss: 0.1441, grad_norm: 1.4810
2023-02-19 15:20:14,251 - mmseg - INFO - Saving checkpoint at 142000 iterations
2023-02-19 15:20:17,585 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:20:17,585 - mmseg - INFO - Iter [142000/160000]	lr: 6.750e-06, eta: 1:25:44, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0959, decode.acc_seg: 95.7173, aux.loss_ce: 0.0582, aux.acc_seg: 93.7011, loss: 0.1541, grad_norm: 1.3824
2023-02-19 15:20:31,311 - mmseg - INFO - Iter [142050/160000]	lr: 6.732e-06, eta: 1:25:30, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7637, aux.loss_ce: 0.0572, aux.acc_seg: 93.7601, loss: 0.1521, grad_norm: 1.2784
2023-02-19 15:20:45,288 - mmseg - INFO - Iter [142100/160000]	lr: 6.713e-06, eta: 1:25:16, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0940, decode.acc_seg: 95.7890, aux.loss_ce: 0.0565, aux.acc_seg: 93.8320, loss: 0.1505, grad_norm: 1.2090
2023-02-19 15:20:59,265 - mmseg - INFO - Iter [142150/160000]	lr: 6.694e-06, eta: 1:25:01, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5470, aux.loss_ce: 0.0578, aux.acc_seg: 93.6896, loss: 0.1568, grad_norm: 1.6150
2023-02-19 15:21:13,228 - mmseg - INFO - Iter [142200/160000]	lr: 6.675e-06, eta: 1:24:47, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6982, aux.loss_ce: 0.0563, aux.acc_seg: 93.9839, loss: 0.1530, grad_norm: 1.4127
2023-02-19 15:21:26,968 - mmseg - INFO - Iter [142250/160000]	lr: 6.657e-06, eta: 1:24:33, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.6563, aux.loss_ce: 0.0552, aux.acc_seg: 93.8876, loss: 0.1497, grad_norm: 1.1296
2023-02-19 15:21:40,686 - mmseg - INFO - Iter [142300/160000]	lr: 6.638e-06, eta: 1:24:18, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0939, decode.acc_seg: 95.7799, aux.loss_ce: 0.0574, aux.acc_seg: 93.7500, loss: 0.1513, grad_norm: 1.2079
2023-02-19 15:21:55,093 - mmseg - INFO - Iter [142350/160000]	lr: 6.619e-06, eta: 1:24:04, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0978, decode.acc_seg: 95.6180, aux.loss_ce: 0.0590, aux.acc_seg: 93.5496, loss: 0.1568, grad_norm: 1.7733
2023-02-19 15:22:09,125 - mmseg - INFO - Iter [142400/160000]	lr: 6.600e-06, eta: 1:23:50, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.6742, aux.loss_ce: 0.0554, aux.acc_seg: 93.8941, loss: 0.1524, grad_norm: 1.3853
2023-02-19 15:22:22,702 - mmseg - INFO - Iter [142450/160000]	lr: 6.582e-06, eta: 1:23:35, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1042, decode.acc_seg: 95.3925, aux.loss_ce: 0.0612, aux.acc_seg: 93.4965, loss: 0.1654, grad_norm: 1.3469
2023-02-19 15:22:36,354 - mmseg - INFO - Iter [142500/160000]	lr: 6.563e-06, eta: 1:23:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.8695, aux.loss_ce: 0.0554, aux.acc_seg: 94.0184, loss: 0.1485, grad_norm: 1.1062
2023-02-19 15:22:50,336 - mmseg - INFO - Iter [142550/160000]	lr: 6.544e-06, eta: 1:23:07, time: 0.280, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0923, decode.acc_seg: 95.7463, aux.loss_ce: 0.0573, aux.acc_seg: 93.6295, loss: 0.1495, grad_norm: 1.5780
2023-02-19 15:23:03,910 - mmseg - INFO - Iter [142600/160000]	lr: 6.525e-06, eta: 1:22:52, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0910, decode.acc_seg: 95.8342, aux.loss_ce: 0.0560, aux.acc_seg: 93.8099, loss: 0.1470, grad_norm: 1.2797
2023-02-19 15:23:17,619 - mmseg - INFO - Iter [142650/160000]	lr: 6.507e-06, eta: 1:22:38, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0879, decode.acc_seg: 95.9186, aux.loss_ce: 0.0521, aux.acc_seg: 94.0445, loss: 0.1400, grad_norm: 1.2655
2023-02-19 15:23:31,191 - mmseg - INFO - Iter [142700/160000]	lr: 6.488e-06, eta: 1:22:23, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1025, decode.acc_seg: 95.3155, aux.loss_ce: 0.0597, aux.acc_seg: 93.3714, loss: 0.1623, grad_norm: 1.4502
2023-02-19 15:23:46,968 - mmseg - INFO - Iter [142750/160000]	lr: 6.469e-06, eta: 1:22:09, time: 0.316, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8246, aux.loss_ce: 0.0556, aux.acc_seg: 93.9669, loss: 0.1488, grad_norm: 1.0685
2023-02-19 15:24:00,663 - mmseg - INFO - Iter [142800/160000]	lr: 6.450e-06, eta: 1:21:55, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.6637, aux.loss_ce: 0.0562, aux.acc_seg: 93.7611, loss: 0.1521, grad_norm: 1.2938
2023-02-19 15:24:14,551 - mmseg - INFO - Iter [142850/160000]	lr: 6.432e-06, eta: 1:21:41, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1000, decode.acc_seg: 95.5555, aux.loss_ce: 0.0593, aux.acc_seg: 93.5966, loss: 0.1593, grad_norm: 1.1975
2023-02-19 15:24:28,412 - mmseg - INFO - Iter [142900/160000]	lr: 6.413e-06, eta: 1:21:26, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.9236, aux.loss_ce: 0.0561, aux.acc_seg: 93.9683, loss: 0.1475, grad_norm: 1.2522
2023-02-19 15:24:42,628 - mmseg - INFO - Iter [142950/160000]	lr: 6.394e-06, eta: 1:21:12, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0900, decode.acc_seg: 95.9339, aux.loss_ce: 0.0546, aux.acc_seg: 94.0307, loss: 0.1446, grad_norm: 1.0417
2023-02-19 15:24:56,920 - mmseg - INFO - Saving checkpoint at 143000 iterations
2023-02-19 15:25:00,292 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:25:00,292 - mmseg - INFO - Iter [143000/160000]	lr: 6.375e-06, eta: 1:20:58, time: 0.353, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0989, decode.acc_seg: 95.6108, aux.loss_ce: 0.0584, aux.acc_seg: 93.6657, loss: 0.1573, grad_norm: 1.4655
2023-02-19 15:25:13,970 - mmseg - INFO - Iter [143050/160000]	lr: 6.357e-06, eta: 1:20:44, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.6484, aux.loss_ce: 0.0585, aux.acc_seg: 93.6561, loss: 0.1559, grad_norm: 1.4219
2023-02-19 15:25:27,985 - mmseg - INFO - Iter [143100/160000]	lr: 6.338e-06, eta: 1:20:29, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0927, decode.acc_seg: 95.8049, aux.loss_ce: 0.0558, aux.acc_seg: 93.8564, loss: 0.1485, grad_norm: 1.2326
2023-02-19 15:25:41,649 - mmseg - INFO - Iter [143150/160000]	lr: 6.319e-06, eta: 1:20:15, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0928, decode.acc_seg: 95.8371, aux.loss_ce: 0.0560, aux.acc_seg: 93.8576, loss: 0.1488, grad_norm: 1.5020
2023-02-19 15:25:55,532 - mmseg - INFO - Iter [143200/160000]	lr: 6.300e-06, eta: 1:20:01, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.7326, aux.loss_ce: 0.0563, aux.acc_seg: 93.7776, loss: 0.1499, grad_norm: 1.2186
2023-02-19 15:26:10,558 - mmseg - INFO - Iter [143250/160000]	lr: 6.282e-06, eta: 1:19:47, time: 0.301, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0936, decode.acc_seg: 95.8617, aux.loss_ce: 0.0559, aux.acc_seg: 93.9946, loss: 0.1495, grad_norm: 1.4416
2023-02-19 15:26:24,313 - mmseg - INFO - Iter [143300/160000]	lr: 6.263e-06, eta: 1:19:32, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.9623, aux.loss_ce: 0.0538, aux.acc_seg: 94.2025, loss: 0.1452, grad_norm: 1.0610
2023-02-19 15:26:37,991 - mmseg - INFO - Iter [143350/160000]	lr: 6.244e-06, eta: 1:19:18, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0967, decode.acc_seg: 95.6426, aux.loss_ce: 0.0565, aux.acc_seg: 93.8546, loss: 0.1532, grad_norm: 1.3766
2023-02-19 15:26:51,644 - mmseg - INFO - Iter [143400/160000]	lr: 6.225e-06, eta: 1:19:03, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1002, decode.acc_seg: 95.6669, aux.loss_ce: 0.0628, aux.acc_seg: 93.5269, loss: 0.1630, grad_norm: 2.0015
2023-02-19 15:27:05,337 - mmseg - INFO - Iter [143450/160000]	lr: 6.207e-06, eta: 1:18:49, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.7484, aux.loss_ce: 0.0570, aux.acc_seg: 93.9156, loss: 0.1562, grad_norm: 1.4342
2023-02-19 15:27:19,289 - mmseg - INFO - Iter [143500/160000]	lr: 6.188e-06, eta: 1:18:35, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.4932, aux.loss_ce: 0.0603, aux.acc_seg: 93.3914, loss: 0.1594, grad_norm: 1.5786
2023-02-19 15:27:32,886 - mmseg - INFO - Iter [143550/160000]	lr: 6.169e-06, eta: 1:18:20, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8446, aux.loss_ce: 0.0553, aux.acc_seg: 93.9725, loss: 0.1472, grad_norm: 1.0420
2023-02-19 15:27:46,512 - mmseg - INFO - Iter [143600/160000]	lr: 6.150e-06, eta: 1:18:06, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.7164, aux.loss_ce: 0.0562, aux.acc_seg: 93.8868, loss: 0.1521, grad_norm: 1.5880
2023-02-19 15:28:00,290 - mmseg - INFO - Iter [143650/160000]	lr: 6.132e-06, eta: 1:17:52, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0879, decode.acc_seg: 95.9506, aux.loss_ce: 0.0530, aux.acc_seg: 94.1155, loss: 0.1408, grad_norm: 1.1996
2023-02-19 15:28:14,610 - mmseg - INFO - Iter [143700/160000]	lr: 6.113e-06, eta: 1:17:37, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.5924, aux.loss_ce: 0.0563, aux.acc_seg: 93.8291, loss: 0.1537, grad_norm: 1.3135
2023-02-19 15:28:28,566 - mmseg - INFO - Iter [143750/160000]	lr: 6.094e-06, eta: 1:17:23, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.6356, aux.loss_ce: 0.0569, aux.acc_seg: 93.6919, loss: 0.1527, grad_norm: 1.4514
2023-02-19 15:28:42,518 - mmseg - INFO - Iter [143800/160000]	lr: 6.075e-06, eta: 1:17:09, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0951, decode.acc_seg: 95.7007, aux.loss_ce: 0.0571, aux.acc_seg: 93.6857, loss: 0.1522, grad_norm: 1.4636
2023-02-19 15:28:56,491 - mmseg - INFO - Iter [143850/160000]	lr: 6.057e-06, eta: 1:16:54, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.6810, aux.loss_ce: 0.0562, aux.acc_seg: 93.8986, loss: 0.1535, grad_norm: 1.6563
2023-02-19 15:29:11,144 - mmseg - INFO - Iter [143900/160000]	lr: 6.038e-06, eta: 1:16:40, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.7683, aux.loss_ce: 0.0580, aux.acc_seg: 93.7400, loss: 0.1542, grad_norm: 1.4283
2023-02-19 15:29:25,246 - mmseg - INFO - Iter [143950/160000]	lr: 6.019e-06, eta: 1:16:26, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0945, decode.acc_seg: 95.6896, aux.loss_ce: 0.0557, aux.acc_seg: 93.8857, loss: 0.1503, grad_norm: 1.2339
2023-02-19 15:29:41,677 - mmseg - INFO - Saving checkpoint at 144000 iterations
2023-02-19 15:29:44,980 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:29:44,980 - mmseg - INFO - Iter [144000/160000]	lr: 6.000e-06, eta: 1:16:12, time: 0.395, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8609, aux.loss_ce: 0.0557, aux.acc_seg: 93.9527, loss: 0.1487, grad_norm: 1.1000
2023-02-19 15:29:59,257 - mmseg - INFO - per class results:
2023-02-19 15:29:59,263 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 79.13 | 88.47 |
|       building      | 82.74 | 92.01 |
|         sky         |  94.3 |  98.2 |
|        floor        | 81.25 | 92.15 |
|         tree        | 75.59 | 88.23 |
|       ceiling       | 85.06 |  94.9 |
|         road        | 83.57 | 89.25 |
|         bed         | 90.97 | 95.86 |
|      windowpane     | 63.63 | 79.21 |
|        grass        | 67.24 | 81.97 |
|       cabinet       | 62.03 |  73.9 |
|       sidewalk      | 69.01 | 86.01 |
|        person       | 82.78 | 92.41 |
|        earth        | 36.96 | 51.22 |
|         door        | 54.52 |  68.6 |
|        table        | 65.49 | 79.31 |
|       mountain      | 58.73 | 71.86 |
|        plant        | 51.37 | 61.33 |
|       curtain       | 76.43 | 87.52 |
|        chair        | 65.35 | 80.19 |
|         car         |  85.8 | 92.64 |
|        water        | 55.79 | 69.51 |
|       painting      | 76.85 | 90.38 |
|         sofa        | 74.94 | 85.73 |
|        shelf        | 45.55 | 65.37 |
|        house        | 37.97 | 49.71 |
|         sea         | 62.43 | 84.71 |
|        mirror       | 73.29 | 80.75 |
|         rug         | 50.78 | 59.22 |
|        field        | 30.49 | 49.38 |
|       armchair      | 50.92 | 68.46 |
|         seat        | 64.64 | 83.72 |
|        fence        | 44.68 | 56.87 |
|         desk        | 54.77 | 71.28 |
|         rock        | 50.14 | 73.77 |
|       wardrobe      | 46.39 |  67.9 |
|         lamp        | 68.23 | 82.53 |
|       bathtub       | 81.35 | 84.85 |
|       railing       | 37.59 | 49.96 |
|       cushion       | 63.64 | 81.57 |
|         base        | 37.53 | 49.78 |
|         box         | 28.91 | 35.31 |
|        column       | 50.05 | 64.18 |
|      signboard      | 41.11 | 56.27 |
|   chest of drawers  | 40.67 |  62.8 |
|       counter       | 30.55 | 41.86 |
|         sand        |  56.3 | 79.51 |
|         sink        | 74.49 | 82.85 |
|      skyscraper     | 52.58 |  66.5 |
|      fireplace      |  79.2 | 92.56 |
|     refrigerator    | 83.94 | 94.15 |
|      grandstand     | 44.39 | 77.08 |
|         path        | 26.55 | 40.24 |
|        stairs       | 30.22 | 35.46 |
|        runway       | 68.19 | 90.39 |
|         case        | 49.99 | 70.83 |
|      pool table     | 94.03 | 96.38 |
|        pillow       | 59.64 | 70.41 |
|     screen door     | 82.41 | 84.97 |
|       stairway      | 31.06 | 40.91 |
|        river        | 10.64 | 23.06 |
|        bridge       | 66.93 | 82.08 |
|       bookcase      | 46.53 | 70.39 |
|        blind        | 44.27 | 49.38 |
|     coffee table    | 63.78 | 76.57 |
|        toilet       | 87.34 | 91.12 |
|        flower       |  42.9 | 62.04 |
|         book        | 41.19 | 54.75 |
|         hill        | 12.39 | 19.18 |
|        bench        | 47.71 | 51.48 |
|      countertop     |  54.1 | 75.77 |
|        stove        | 81.91 | 86.53 |
|         palm        | 56.54 | 78.94 |
|    kitchen island   | 47.82 | 73.73 |
|       computer      | 76.77 | 86.12 |
|     swivel chair    |  43.2 |  61.5 |
|         boat        | 54.37 | 58.71 |
|         bar         | 35.84 | 46.31 |
|    arcade machine   | 42.32 | 45.05 |
|        hovel        | 44.25 | 50.07 |
|         bus         | 92.04 | 96.79 |
|        towel        | 72.71 | 82.21 |
|        light        | 56.12 | 62.38 |
|        truck        | 41.79 | 50.38 |
|        tower        | 33.69 | 43.95 |
|      chandelier     | 71.91 | 82.72 |
|        awning       | 38.33 | 49.33 |
|     streetlight     |  35.2 | 48.33 |
|        booth        | 51.64 | 62.04 |
| television receiver | 73.91 |  81.4 |
|       airplane      |  59.3 | 67.63 |
|      dirt track     | 11.61 | 37.96 |
|       apparel       | 38.17 | 58.34 |
|         pole        | 26.58 | 39.58 |
|         land        |  4.57 |  6.11 |
|      bannister      | 12.99 | 15.81 |
|      escalator      | 43.32 | 58.34 |
|       ottoman       | 48.04 | 67.65 |
|        bottle       |  36.6 | 57.24 |
|        buffet       | 32.63 | 36.58 |
|        poster       | 27.54 |  37.0 |
|        stage        | 15.91 | 20.61 |
|         van         | 40.35 | 56.92 |
|         ship        | 63.59 | 90.92 |
|       fountain      | 24.39 | 24.66 |
|    conveyer belt    | 81.01 | 91.93 |
|        canopy       | 45.35 | 53.61 |
|        washer       | 72.37 | 75.13 |
|      plaything      | 31.67 | 41.32 |
|    swimming pool    | 55.82 | 72.63 |
|        stool        | 48.58 | 62.08 |
|        barrel       | 47.08 | 74.43 |
|        basket       | 41.94 | 61.16 |
|      waterfall      | 50.63 |  60.4 |
|         tent        |  91.7 | 98.19 |
|         bag         | 17.88 |  22.0 |
|       minibike      | 70.97 | 90.05 |
|        cradle       | 83.77 | 92.58 |
|         oven        | 43.74 | 66.17 |
|         ball        | 54.28 | 70.64 |
|         food        | 51.61 |  58.1 |
|         step        |  8.38 | 10.25 |
|         tank        | 61.11 | 63.86 |
|      trade name     | 23.22 | 28.61 |
|      microwave      | 74.54 | 82.35 |
|         pot         | 54.53 | 63.87 |
|        animal       | 62.14 | 66.47 |
|       bicycle       | 60.75 | 81.15 |
|         lake        | 49.11 | 58.05 |
|      dishwasher     | 70.04 | 75.66 |
|        screen       | 54.44 |  69.5 |
|       blanket       | 22.76 | 26.69 |
|      sculpture      | 72.52 | 86.71 |
|         hood        | 73.16 | 77.06 |
|        sconce       | 52.42 | 64.32 |
|         vase        | 42.91 | 61.93 |
|    traffic light    | 36.51 | 53.18 |
|         tray        | 17.52 | 24.82 |
|        ashcan       | 44.34 | 62.12 |
|         fan         | 68.65 | 77.47 |
|         pier        | 37.23 | 58.68 |
|      crt screen     |  1.64 |  5.1  |
|        plate        | 60.94 | 78.13 |
|       monitor       |  9.22 | 10.71 |
|    bulletin board   | 35.53 | 39.83 |
|        shower       | 11.99 | 19.75 |
|       radiator      | 73.12 | 79.27 |
|        glass        | 16.12 | 17.67 |
|        clock        | 45.09 |  49.7 |
|         flag        | 54.71 | 66.86 |
+---------------------+-------+-------+
2023-02-19 15:29:59,264 - mmseg - INFO - Summary:
2023-02-19 15:29:59,264 - mmseg - INFO - 
+------+-------+-------+
| aAcc |  mIoU |  mAcc |
+------+-------+-------+
| 84.1 | 52.54 | 64.41 |
+------+-------+-------+
2023-02-19 15:30:02,453 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_144000.pth.
2023-02-19 15:30:02,453 - mmseg - INFO - Best mIoU is 0.5254 at 144000 iter.
2023-02-19 15:30:02,453 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:30:02,454 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8410, mIoU: 0.5254, mAcc: 0.6441, IoU.wall: 0.7913, IoU.building: 0.8274, IoU.sky: 0.9430, IoU.floor: 0.8125, IoU.tree: 0.7559, IoU.ceiling: 0.8506, IoU.road: 0.8357, IoU.bed : 0.9097, IoU.windowpane: 0.6363, IoU.grass: 0.6724, IoU.cabinet: 0.6203, IoU.sidewalk: 0.6901, IoU.person: 0.8278, IoU.earth: 0.3696, IoU.door: 0.5452, IoU.table: 0.6549, IoU.mountain: 0.5873, IoU.plant: 0.5137, IoU.curtain: 0.7643, IoU.chair: 0.6535, IoU.car: 0.8580, IoU.water: 0.5579, IoU.painting: 0.7685, IoU.sofa: 0.7494, IoU.shelf: 0.4555, IoU.house: 0.3797, IoU.sea: 0.6243, IoU.mirror: 0.7329, IoU.rug: 0.5078, IoU.field: 0.3049, IoU.armchair: 0.5092, IoU.seat: 0.6464, IoU.fence: 0.4468, IoU.desk: 0.5477, IoU.rock: 0.5014, IoU.wardrobe: 0.4639, IoU.lamp: 0.6823, IoU.bathtub: 0.8135, IoU.railing: 0.3759, IoU.cushion: 0.6364, IoU.base: 0.3753, IoU.box: 0.2891, IoU.column: 0.5005, IoU.signboard: 0.4111, IoU.chest of drawers: 0.4067, IoU.counter: 0.3055, IoU.sand: 0.5630, IoU.sink: 0.7449, IoU.skyscraper: 0.5258, IoU.fireplace: 0.7920, IoU.refrigerator: 0.8394, IoU.grandstand: 0.4439, IoU.path: 0.2655, IoU.stairs: 0.3022, IoU.runway: 0.6819, IoU.case: 0.4999, IoU.pool table: 0.9403, IoU.pillow: 0.5964, IoU.screen door: 0.8241, IoU.stairway: 0.3106, IoU.river: 0.1064, IoU.bridge: 0.6693, IoU.bookcase: 0.4653, IoU.blind: 0.4427, IoU.coffee table: 0.6378, IoU.toilet: 0.8734, IoU.flower: 0.4290, IoU.book: 0.4119, IoU.hill: 0.1239, IoU.bench: 0.4771, IoU.countertop: 0.5410, IoU.stove: 0.8191, IoU.palm: 0.5654, IoU.kitchen island: 0.4782, IoU.computer: 0.7677, IoU.swivel chair: 0.4320, IoU.boat: 0.5437, IoU.bar: 0.3584, IoU.arcade machine: 0.4232, IoU.hovel: 0.4425, IoU.bus: 0.9204, IoU.towel: 0.7271, IoU.light: 0.5612, IoU.truck: 0.4179, IoU.tower: 0.3369, IoU.chandelier: 0.7191, IoU.awning: 0.3833, IoU.streetlight: 0.3520, IoU.booth: 0.5164, IoU.television receiver: 0.7391, IoU.airplane: 0.5930, IoU.dirt track: 0.1161, IoU.apparel: 0.3817, IoU.pole: 0.2658, IoU.land: 0.0457, IoU.bannister: 0.1299, IoU.escalator: 0.4332, IoU.ottoman: 0.4804, IoU.bottle: 0.3660, IoU.buffet: 0.3263, IoU.poster: 0.2754, IoU.stage: 0.1591, IoU.van: 0.4035, IoU.ship: 0.6359, IoU.fountain: 0.2439, IoU.conveyer belt: 0.8101, IoU.canopy: 0.4535, IoU.washer: 0.7237, IoU.plaything: 0.3167, IoU.swimming pool: 0.5582, IoU.stool: 0.4858, IoU.barrel: 0.4708, IoU.basket: 0.4194, IoU.waterfall: 0.5063, IoU.tent: 0.9170, IoU.bag: 0.1788, IoU.minibike: 0.7097, IoU.cradle: 0.8377, IoU.oven: 0.4374, IoU.ball: 0.5428, IoU.food: 0.5161, IoU.step: 0.0838, IoU.tank: 0.6111, IoU.trade name: 0.2322, IoU.microwave: 0.7454, IoU.pot: 0.5453, IoU.animal: 0.6214, IoU.bicycle: 0.6075, IoU.lake: 0.4911, IoU.dishwasher: 0.7004, IoU.screen: 0.5444, IoU.blanket: 0.2276, IoU.sculpture: 0.7252, IoU.hood: 0.7316, IoU.sconce: 0.5242, IoU.vase: 0.4291, IoU.traffic light: 0.3651, IoU.tray: 0.1752, IoU.ashcan: 0.4434, IoU.fan: 0.6865, IoU.pier: 0.3723, IoU.crt screen: 0.0164, IoU.plate: 0.6094, IoU.monitor: 0.0922, IoU.bulletin board: 0.3553, IoU.shower: 0.1199, IoU.radiator: 0.7312, IoU.glass: 0.1612, IoU.clock: 0.4509, IoU.flag: 0.5471, Acc.wall: 0.8847, Acc.building: 0.9201, Acc.sky: 0.9820, Acc.floor: 0.9215, Acc.tree: 0.8823, Acc.ceiling: 0.9490, Acc.road: 0.8925, Acc.bed : 0.9586, Acc.windowpane: 0.7921, Acc.grass: 0.8197, Acc.cabinet: 0.7390, Acc.sidewalk: 0.8601, Acc.person: 0.9241, Acc.earth: 0.5122, Acc.door: 0.6860, Acc.table: 0.7931, Acc.mountain: 0.7186, Acc.plant: 0.6133, Acc.curtain: 0.8752, Acc.chair: 0.8019, Acc.car: 0.9264, Acc.water: 0.6951, Acc.painting: 0.9038, Acc.sofa: 0.8573, Acc.shelf: 0.6537, Acc.house: 0.4971, Acc.sea: 0.8471, Acc.mirror: 0.8075, Acc.rug: 0.5922, Acc.field: 0.4938, Acc.armchair: 0.6846, Acc.seat: 0.8372, Acc.fence: 0.5687, Acc.desk: 0.7128, Acc.rock: 0.7377, Acc.wardrobe: 0.6790, Acc.lamp: 0.8253, Acc.bathtub: 0.8485, Acc.railing: 0.4996, Acc.cushion: 0.8157, Acc.base: 0.4978, Acc.box: 0.3531, Acc.column: 0.6418, Acc.signboard: 0.5627, Acc.chest of drawers: 0.6280, Acc.counter: 0.4186, Acc.sand: 0.7951, Acc.sink: 0.8285, Acc.skyscraper: 0.6650, Acc.fireplace: 0.9256, Acc.refrigerator: 0.9415, Acc.grandstand: 0.7708, Acc.path: 0.4024, Acc.stairs: 0.3546, Acc.runway: 0.9039, Acc.case: 0.7083, Acc.pool table: 0.9638, Acc.pillow: 0.7041, Acc.screen door: 0.8497, Acc.stairway: 0.4091, Acc.river: 0.2306, Acc.bridge: 0.8208, Acc.bookcase: 0.7039, Acc.blind: 0.4938, Acc.coffee table: 0.7657, Acc.toilet: 0.9112, Acc.flower: 0.6204, Acc.book: 0.5475, Acc.hill: 0.1918, Acc.bench: 0.5148, Acc.countertop: 0.7577, Acc.stove: 0.8653, Acc.palm: 0.7894, Acc.kitchen island: 0.7373, Acc.computer: 0.8612, Acc.swivel chair: 0.6150, Acc.boat: 0.5871, Acc.bar: 0.4631, Acc.arcade machine: 0.4505, Acc.hovel: 0.5007, Acc.bus: 0.9679, Acc.towel: 0.8221, Acc.light: 0.6238, Acc.truck: 0.5038, Acc.tower: 0.4395, Acc.chandelier: 0.8272, Acc.awning: 0.4933, Acc.streetlight: 0.4833, Acc.booth: 0.6204, Acc.television receiver: 0.8140, Acc.airplane: 0.6763, Acc.dirt track: 0.3796, Acc.apparel: 0.5834, Acc.pole: 0.3958, Acc.land: 0.0611, Acc.bannister: 0.1581, Acc.escalator: 0.5834, Acc.ottoman: 0.6765, Acc.bottle: 0.5724, Acc.buffet: 0.3658, Acc.poster: 0.3700, Acc.stage: 0.2061, Acc.van: 0.5692, Acc.ship: 0.9092, Acc.fountain: 0.2466, Acc.conveyer belt: 0.9193, Acc.canopy: 0.5361, Acc.washer: 0.7513, Acc.plaything: 0.4132, Acc.swimming pool: 0.7263, Acc.stool: 0.6208, Acc.barrel: 0.7443, Acc.basket: 0.6116, Acc.waterfall: 0.6040, Acc.tent: 0.9819, Acc.bag: 0.2200, Acc.minibike: 0.9005, Acc.cradle: 0.9258, Acc.oven: 0.6617, Acc.ball: 0.7064, Acc.food: 0.5810, Acc.step: 0.1025, Acc.tank: 0.6386, Acc.trade name: 0.2861, Acc.microwave: 0.8235, Acc.pot: 0.6387, Acc.animal: 0.6647, Acc.bicycle: 0.8115, Acc.lake: 0.5805, Acc.dishwasher: 0.7566, Acc.screen: 0.6950, Acc.blanket: 0.2669, Acc.sculpture: 0.8671, Acc.hood: 0.7706, Acc.sconce: 0.6432, Acc.vase: 0.6193, Acc.traffic light: 0.5318, Acc.tray: 0.2482, Acc.ashcan: 0.6212, Acc.fan: 0.7747, Acc.pier: 0.5868, Acc.crt screen: 0.0510, Acc.plate: 0.7813, Acc.monitor: 0.1071, Acc.bulletin board: 0.3983, Acc.shower: 0.1975, Acc.radiator: 0.7927, Acc.glass: 0.1767, Acc.clock: 0.4970, Acc.flag: 0.6686
2023-02-19 15:30:16,216 - mmseg - INFO - Iter [144050/160000]	lr: 5.982e-06, eta: 1:16:00, time: 0.625, data_time: 0.354, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.9328, aux.loss_ce: 0.0559, aux.acc_seg: 94.0321, loss: 0.1485, grad_norm: 1.3493
2023-02-19 15:30:29,801 - mmseg - INFO - Iter [144100/160000]	lr: 5.963e-06, eta: 1:15:45, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0938, decode.acc_seg: 95.7136, aux.loss_ce: 0.0547, aux.acc_seg: 93.9406, loss: 0.1486, grad_norm: 1.1729
2023-02-19 15:30:43,809 - mmseg - INFO - Iter [144150/160000]	lr: 5.944e-06, eta: 1:15:31, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0950, decode.acc_seg: 95.7806, aux.loss_ce: 0.0557, aux.acc_seg: 93.9598, loss: 0.1507, grad_norm: 1.4344
2023-02-19 15:30:57,917 - mmseg - INFO - Iter [144200/160000]	lr: 5.925e-06, eta: 1:15:17, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8651, aux.loss_ce: 0.0549, aux.acc_seg: 94.0991, loss: 0.1479, grad_norm: 1.0362
2023-02-19 15:31:11,973 - mmseg - INFO - Iter [144250/160000]	lr: 5.907e-06, eta: 1:15:02, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0995, decode.acc_seg: 95.5611, aux.loss_ce: 0.0597, aux.acc_seg: 93.5781, loss: 0.1592, grad_norm: 1.4878
2023-02-19 15:31:26,379 - mmseg - INFO - Iter [144300/160000]	lr: 5.888e-06, eta: 1:14:48, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.8844, aux.loss_ce: 0.0543, aux.acc_seg: 94.0160, loss: 0.1451, grad_norm: 1.2114
2023-02-19 15:31:41,192 - mmseg - INFO - Iter [144350/160000]	lr: 5.869e-06, eta: 1:14:34, time: 0.296, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0979, decode.acc_seg: 95.5828, aux.loss_ce: 0.0568, aux.acc_seg: 93.7287, loss: 0.1547, grad_norm: 1.2079
2023-02-19 15:31:55,018 - mmseg - INFO - Iter [144400/160000]	lr: 5.850e-06, eta: 1:14:20, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8692, aux.loss_ce: 0.0567, aux.acc_seg: 93.8900, loss: 0.1497, grad_norm: 1.5270
2023-02-19 15:32:08,832 - mmseg - INFO - Iter [144450/160000]	lr: 5.832e-06, eta: 1:14:05, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.7538, aux.loss_ce: 0.0579, aux.acc_seg: 93.7479, loss: 0.1522, grad_norm: 1.4826
2023-02-19 15:32:22,526 - mmseg - INFO - Iter [144500/160000]	lr: 5.813e-06, eta: 1:13:51, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.8460, aux.loss_ce: 0.0553, aux.acc_seg: 94.0328, loss: 0.1474, grad_norm: 1.4146
2023-02-19 15:32:37,054 - mmseg - INFO - Iter [144550/160000]	lr: 5.794e-06, eta: 1:13:37, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.9169, aux.loss_ce: 0.0567, aux.acc_seg: 93.7945, loss: 0.1484, grad_norm: 1.0288
2023-02-19 15:32:51,487 - mmseg - INFO - Iter [144600/160000]	lr: 5.775e-06, eta: 1:13:22, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.7804, aux.loss_ce: 0.0559, aux.acc_seg: 93.7132, loss: 0.1485, grad_norm: 1.3059
2023-02-19 15:33:05,244 - mmseg - INFO - Iter [144650/160000]	lr: 5.757e-06, eta: 1:13:08, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0982, decode.acc_seg: 95.7179, aux.loss_ce: 0.0591, aux.acc_seg: 93.7313, loss: 0.1573, grad_norm: 1.5653
2023-02-19 15:33:19,130 - mmseg - INFO - Iter [144700/160000]	lr: 5.738e-06, eta: 1:12:54, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.7666, aux.loss_ce: 0.0565, aux.acc_seg: 93.8249, loss: 0.1511, grad_norm: 1.6433
2023-02-19 15:33:32,854 - mmseg - INFO - Iter [144750/160000]	lr: 5.719e-06, eta: 1:12:39, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1030, decode.acc_seg: 95.7305, aux.loss_ce: 0.0590, aux.acc_seg: 93.8206, loss: 0.1620, grad_norm: 2.9505
2023-02-19 15:33:46,559 - mmseg - INFO - Iter [144800/160000]	lr: 5.700e-06, eta: 1:12:25, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.9086, aux.loss_ce: 0.0541, aux.acc_seg: 94.0949, loss: 0.1449, grad_norm: 1.4765
2023-02-19 15:34:00,577 - mmseg - INFO - Iter [144850/160000]	lr: 5.682e-06, eta: 1:12:11, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.8409, aux.loss_ce: 0.0566, aux.acc_seg: 94.0639, loss: 0.1532, grad_norm: 1.3956
2023-02-19 15:34:14,414 - mmseg - INFO - Iter [144900/160000]	lr: 5.663e-06, eta: 1:11:56, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.6554, aux.loss_ce: 0.0579, aux.acc_seg: 93.7007, loss: 0.1537, grad_norm: 1.2353
2023-02-19 15:34:28,313 - mmseg - INFO - Iter [144950/160000]	lr: 5.644e-06, eta: 1:11:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0935, decode.acc_seg: 95.7771, aux.loss_ce: 0.0553, aux.acc_seg: 93.9797, loss: 0.1488, grad_norm: 1.1662
2023-02-19 15:34:42,573 - mmseg - INFO - Saving checkpoint at 145000 iterations
2023-02-19 15:34:45,885 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:34:45,885 - mmseg - INFO - Iter [145000/160000]	lr: 5.625e-06, eta: 1:11:28, time: 0.352, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.8563, aux.loss_ce: 0.0553, aux.acc_seg: 94.0508, loss: 0.1488, grad_norm: 1.2668
2023-02-19 15:34:59,746 - mmseg - INFO - Iter [145050/160000]	lr: 5.607e-06, eta: 1:11:14, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8910, aux.loss_ce: 0.0544, aux.acc_seg: 94.0491, loss: 0.1463, grad_norm: 1.2168
2023-02-19 15:35:14,069 - mmseg - INFO - Iter [145100/160000]	lr: 5.588e-06, eta: 1:10:59, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0953, decode.acc_seg: 95.7172, aux.loss_ce: 0.0557, aux.acc_seg: 93.9152, loss: 0.1510, grad_norm: 1.2692
2023-02-19 15:35:29,046 - mmseg - INFO - Iter [145150/160000]	lr: 5.569e-06, eta: 1:10:45, time: 0.300, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0959, decode.acc_seg: 95.7387, aux.loss_ce: 0.0561, aux.acc_seg: 93.9997, loss: 0.1521, grad_norm: 1.1475
2023-02-19 15:35:43,161 - mmseg - INFO - Iter [145200/160000]	lr: 5.550e-06, eta: 1:10:31, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0987, decode.acc_seg: 95.5574, aux.loss_ce: 0.0582, aux.acc_seg: 93.6343, loss: 0.1569, grad_norm: 1.8305
2023-02-19 15:35:59,061 - mmseg - INFO - Iter [145250/160000]	lr: 5.532e-06, eta: 1:10:17, time: 0.318, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0927, decode.acc_seg: 95.9046, aux.loss_ce: 0.0552, aux.acc_seg: 94.0527, loss: 0.1479, grad_norm: 1.3869
2023-02-19 15:36:13,135 - mmseg - INFO - Iter [145300/160000]	lr: 5.513e-06, eta: 1:10:02, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.7575, aux.loss_ce: 0.0547, aux.acc_seg: 93.8783, loss: 0.1472, grad_norm: 1.4242
2023-02-19 15:36:26,829 - mmseg - INFO - Iter [145350/160000]	lr: 5.494e-06, eta: 1:09:48, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.8218, aux.loss_ce: 0.0544, aux.acc_seg: 93.8861, loss: 0.1455, grad_norm: 1.2396
2023-02-19 15:36:40,663 - mmseg - INFO - Iter [145400/160000]	lr: 5.475e-06, eta: 1:09:34, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.8511, aux.loss_ce: 0.0557, aux.acc_seg: 93.8478, loss: 0.1473, grad_norm: 1.3446
2023-02-19 15:36:54,244 - mmseg - INFO - Iter [145450/160000]	lr: 5.457e-06, eta: 1:09:19, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.8725, aux.loss_ce: 0.0539, aux.acc_seg: 94.1470, loss: 0.1461, grad_norm: 1.0928
2023-02-19 15:37:08,217 - mmseg - INFO - Iter [145500/160000]	lr: 5.438e-06, eta: 1:09:05, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0962, decode.acc_seg: 95.8530, aux.loss_ce: 0.0598, aux.acc_seg: 93.7232, loss: 0.1560, grad_norm: 1.1590
2023-02-19 15:37:22,590 - mmseg - INFO - Iter [145550/160000]	lr: 5.419e-06, eta: 1:08:51, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.6969, aux.loss_ce: 0.0586, aux.acc_seg: 93.7696, loss: 0.1557, grad_norm: 1.5589
2023-02-19 15:37:36,197 - mmseg - INFO - Iter [145600/160000]	lr: 5.400e-06, eta: 1:08:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.7271, aux.loss_ce: 0.0588, aux.acc_seg: 93.7920, loss: 0.1563, grad_norm: 1.6721
2023-02-19 15:37:49,785 - mmseg - INFO - Iter [145650/160000]	lr: 5.382e-06, eta: 1:08:22, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.5836, aux.loss_ce: 0.0579, aux.acc_seg: 93.6219, loss: 0.1570, grad_norm: 1.5890
2023-02-19 15:38:03,695 - mmseg - INFO - Iter [145700/160000]	lr: 5.363e-06, eta: 1:08:08, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0972, decode.acc_seg: 95.7045, aux.loss_ce: 0.0576, aux.acc_seg: 93.8694, loss: 0.1548, grad_norm: 1.4362
2023-02-19 15:38:17,786 - mmseg - INFO - Iter [145750/160000]	lr: 5.344e-06, eta: 1:07:53, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0992, decode.acc_seg: 95.5616, aux.loss_ce: 0.0604, aux.acc_seg: 93.4097, loss: 0.1596, grad_norm: 1.5819
2023-02-19 15:38:31,855 - mmseg - INFO - Iter [145800/160000]	lr: 5.325e-06, eta: 1:07:39, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.8024, aux.loss_ce: 0.0559, aux.acc_seg: 93.9207, loss: 0.1506, grad_norm: 1.2289
2023-02-19 15:38:46,089 - mmseg - INFO - Iter [145850/160000]	lr: 5.307e-06, eta: 1:07:25, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.7590, aux.loss_ce: 0.0559, aux.acc_seg: 93.9359, loss: 0.1493, grad_norm: 1.2951
2023-02-19 15:39:00,438 - mmseg - INFO - Iter [145900/160000]	lr: 5.288e-06, eta: 1:07:10, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0974, decode.acc_seg: 95.7026, aux.loss_ce: 0.0571, aux.acc_seg: 93.8861, loss: 0.1545, grad_norm: 1.3367
2023-02-19 15:39:13,949 - mmseg - INFO - Iter [145950/160000]	lr: 5.269e-06, eta: 1:06:56, time: 0.270, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8529, aux.loss_ce: 0.0565, aux.acc_seg: 93.9151, loss: 0.1494, grad_norm: 1.1880
2023-02-19 15:39:27,516 - mmseg - INFO - Saving checkpoint at 146000 iterations
2023-02-19 15:39:30,821 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:39:30,821 - mmseg - INFO - Iter [146000/160000]	lr: 5.250e-06, eta: 1:06:42, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.7979, aux.loss_ce: 0.0550, aux.acc_seg: 93.9222, loss: 0.1469, grad_norm: 1.1936
2023-02-19 15:39:44,627 - mmseg - INFO - Iter [146050/160000]	lr: 5.232e-06, eta: 1:06:28, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.7287, aux.loss_ce: 0.0544, aux.acc_seg: 94.0048, loss: 0.1487, grad_norm: 1.0966
2023-02-19 15:39:58,471 - mmseg - INFO - Iter [146100/160000]	lr: 5.213e-06, eta: 1:06:13, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0900, decode.acc_seg: 95.9496, aux.loss_ce: 0.0554, aux.acc_seg: 93.9844, loss: 0.1454, grad_norm: 1.3322
2023-02-19 15:40:12,417 - mmseg - INFO - Iter [146150/160000]	lr: 5.194e-06, eta: 1:05:59, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.8959, aux.loss_ce: 0.0556, aux.acc_seg: 94.0111, loss: 0.1465, grad_norm: 1.2023
2023-02-19 15:40:27,170 - mmseg - INFO - Iter [146200/160000]	lr: 5.175e-06, eta: 1:05:45, time: 0.295, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.5994, aux.loss_ce: 0.0570, aux.acc_seg: 93.7120, loss: 0.1536, grad_norm: 1.3049
2023-02-19 15:40:41,348 - mmseg - INFO - Iter [146250/160000]	lr: 5.157e-06, eta: 1:05:30, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.9089, aux.loss_ce: 0.0564, aux.acc_seg: 93.9784, loss: 0.1493, grad_norm: 1.1918
2023-02-19 15:40:55,587 - mmseg - INFO - Iter [146300/160000]	lr: 5.138e-06, eta: 1:05:16, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.7138, aux.loss_ce: 0.0590, aux.acc_seg: 93.5468, loss: 0.1544, grad_norm: 1.2522
2023-02-19 15:41:09,472 - mmseg - INFO - Iter [146350/160000]	lr: 5.119e-06, eta: 1:05:02, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.8537, aux.loss_ce: 0.0564, aux.acc_seg: 93.8835, loss: 0.1489, grad_norm: 1.3063
2023-02-19 15:41:23,059 - mmseg - INFO - Iter [146400/160000]	lr: 5.100e-06, eta: 1:04:47, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.8071, aux.loss_ce: 0.0559, aux.acc_seg: 93.8784, loss: 0.1482, grad_norm: 1.0472
2023-02-19 15:41:37,186 - mmseg - INFO - Iter [146450/160000]	lr: 5.082e-06, eta: 1:04:33, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.6697, aux.loss_ce: 0.0562, aux.acc_seg: 93.7730, loss: 0.1517, grad_norm: 1.5797
2023-02-19 15:41:51,159 - mmseg - INFO - Iter [146500/160000]	lr: 5.063e-06, eta: 1:04:19, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0942, decode.acc_seg: 95.8167, aux.loss_ce: 0.0552, aux.acc_seg: 94.0457, loss: 0.1493, grad_norm: 1.3176
2023-02-19 15:42:07,556 - mmseg - INFO - Iter [146550/160000]	lr: 5.044e-06, eta: 1:04:05, time: 0.328, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.7804, aux.loss_ce: 0.0556, aux.acc_seg: 94.0239, loss: 0.1503, grad_norm: 1.2631
2023-02-19 15:42:21,324 - mmseg - INFO - Iter [146600/160000]	lr: 5.025e-06, eta: 1:03:50, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.7024, aux.loss_ce: 0.0575, aux.acc_seg: 93.8535, loss: 0.1539, grad_norm: 1.3990
2023-02-19 15:42:34,943 - mmseg - INFO - Iter [146650/160000]	lr: 5.007e-06, eta: 1:03:36, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0877, decode.acc_seg: 95.9711, aux.loss_ce: 0.0530, aux.acc_seg: 94.1089, loss: 0.1407, grad_norm: 1.2110
2023-02-19 15:42:48,573 - mmseg - INFO - Iter [146700/160000]	lr: 4.988e-06, eta: 1:03:22, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0981, decode.acc_seg: 95.7043, aux.loss_ce: 0.0575, aux.acc_seg: 93.7565, loss: 0.1556, grad_norm: 1.7976
2023-02-19 15:43:02,317 - mmseg - INFO - Iter [146750/160000]	lr: 4.969e-06, eta: 1:03:07, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.9282, aux.loss_ce: 0.0562, aux.acc_seg: 93.9071, loss: 0.1477, grad_norm: 1.2281
2023-02-19 15:43:15,936 - mmseg - INFO - Iter [146800/160000]	lr: 4.950e-06, eta: 1:02:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.6683, aux.loss_ce: 0.0573, aux.acc_seg: 93.7421, loss: 0.1537, grad_norm: 1.3563
2023-02-19 15:43:29,901 - mmseg - INFO - Iter [146850/160000]	lr: 4.932e-06, eta: 1:02:39, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.8395, aux.loss_ce: 0.0576, aux.acc_seg: 93.9079, loss: 0.1523, grad_norm: 1.5934
2023-02-19 15:43:44,753 - mmseg - INFO - Iter [146900/160000]	lr: 4.913e-06, eta: 1:02:24, time: 0.298, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0950, decode.acc_seg: 95.8117, aux.loss_ce: 0.0567, aux.acc_seg: 93.9077, loss: 0.1517, grad_norm: 0.9801
2023-02-19 15:43:59,333 - mmseg - INFO - Iter [146950/160000]	lr: 4.894e-06, eta: 1:02:10, time: 0.292, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.8291, aux.loss_ce: 0.0540, aux.acc_seg: 94.0186, loss: 0.1454, grad_norm: 1.0880
2023-02-19 15:44:13,132 - mmseg - INFO - Saving checkpoint at 147000 iterations
2023-02-19 15:44:16,377 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:44:16,378 - mmseg - INFO - Iter [147000/160000]	lr: 4.875e-06, eta: 1:01:56, time: 0.341, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0916, decode.acc_seg: 95.8555, aux.loss_ce: 0.0570, aux.acc_seg: 93.7587, loss: 0.1486, grad_norm: 1.2330
2023-02-19 15:44:30,303 - mmseg - INFO - Iter [147050/160000]	lr: 4.857e-06, eta: 1:01:42, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0951, decode.acc_seg: 95.7171, aux.loss_ce: 0.0568, aux.acc_seg: 93.7042, loss: 0.1520, grad_norm: 1.3551
2023-02-19 15:44:44,480 - mmseg - INFO - Iter [147100/160000]	lr: 4.838e-06, eta: 1:01:27, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.8637, aux.loss_ce: 0.0558, aux.acc_seg: 94.0000, loss: 0.1492, grad_norm: 1.5462
2023-02-19 15:44:58,252 - mmseg - INFO - Iter [147150/160000]	lr: 4.819e-06, eta: 1:01:13, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.7935, aux.loss_ce: 0.0544, aux.acc_seg: 93.9858, loss: 0.1482, grad_norm: 1.3204
2023-02-19 15:45:12,473 - mmseg - INFO - Iter [147200/160000]	lr: 4.800e-06, eta: 1:00:59, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 96.0097, aux.loss_ce: 0.0564, aux.acc_seg: 93.9837, loss: 0.1464, grad_norm: 1.4070
2023-02-19 15:45:26,640 - mmseg - INFO - Iter [147250/160000]	lr: 4.782e-06, eta: 1:00:44, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.7973, aux.loss_ce: 0.0564, aux.acc_seg: 93.8832, loss: 0.1496, grad_norm: 1.1236
2023-02-19 15:45:40,899 - mmseg - INFO - Iter [147300/160000]	lr: 4.763e-06, eta: 1:00:30, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0935, decode.acc_seg: 95.8145, aux.loss_ce: 0.0558, aux.acc_seg: 93.9305, loss: 0.1493, grad_norm: 1.3558
2023-02-19 15:45:54,894 - mmseg - INFO - Iter [147350/160000]	lr: 4.744e-06, eta: 1:00:16, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.5470, aux.loss_ce: 0.0583, aux.acc_seg: 93.6550, loss: 0.1574, grad_norm: 1.1537
2023-02-19 15:46:08,878 - mmseg - INFO - Iter [147400/160000]	lr: 4.725e-06, eta: 1:00:02, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.9324, aux.loss_ce: 0.0570, aux.acc_seg: 93.8779, loss: 0.1492, grad_norm: 1.1866
2023-02-19 15:46:22,471 - mmseg - INFO - Iter [147450/160000]	lr: 4.707e-06, eta: 0:59:47, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0940, decode.acc_seg: 95.7629, aux.loss_ce: 0.0559, aux.acc_seg: 93.8214, loss: 0.1499, grad_norm: 1.1888
2023-02-19 15:46:36,364 - mmseg - INFO - Iter [147500/160000]	lr: 4.688e-06, eta: 0:59:33, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0894, decode.acc_seg: 95.9645, aux.loss_ce: 0.0540, aux.acc_seg: 94.0770, loss: 0.1434, grad_norm: 1.0678
2023-02-19 15:46:49,941 - mmseg - INFO - Iter [147550/160000]	lr: 4.669e-06, eta: 0:59:19, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0940, decode.acc_seg: 95.7265, aux.loss_ce: 0.0565, aux.acc_seg: 93.8083, loss: 0.1505, grad_norm: 1.2133
2023-02-19 15:47:04,363 - mmseg - INFO - Iter [147600/160000]	lr: 4.650e-06, eta: 0:59:04, time: 0.288, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.7503, aux.loss_ce: 0.0558, aux.acc_seg: 93.9674, loss: 0.1504, grad_norm: 1.2997
2023-02-19 15:47:18,192 - mmseg - INFO - Iter [147650/160000]	lr: 4.632e-06, eta: 0:58:50, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.7260, aux.loss_ce: 0.0567, aux.acc_seg: 93.8487, loss: 0.1533, grad_norm: 1.2246
2023-02-19 15:47:31,837 - mmseg - INFO - Iter [147700/160000]	lr: 4.613e-06, eta: 0:58:36, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0938, decode.acc_seg: 95.8147, aux.loss_ce: 0.0569, aux.acc_seg: 93.8349, loss: 0.1507, grad_norm: 1.2553
2023-02-19 15:47:45,589 - mmseg - INFO - Iter [147750/160000]	lr: 4.594e-06, eta: 0:58:21, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8787, aux.loss_ce: 0.0550, aux.acc_seg: 94.0834, loss: 0.1479, grad_norm: 1.1148
2023-02-19 15:48:01,851 - mmseg - INFO - Iter [147800/160000]	lr: 4.575e-06, eta: 0:58:07, time: 0.325, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8365, aux.loss_ce: 0.0562, aux.acc_seg: 93.8177, loss: 0.1494, grad_norm: 1.2968
2023-02-19 15:48:15,449 - mmseg - INFO - Iter [147850/160000]	lr: 4.557e-06, eta: 0:57:53, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0907, decode.acc_seg: 95.8851, aux.loss_ce: 0.0540, aux.acc_seg: 94.0915, loss: 0.1446, grad_norm: 1.0206
2023-02-19 15:48:29,602 - mmseg - INFO - Iter [147900/160000]	lr: 4.538e-06, eta: 0:57:38, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0903, decode.acc_seg: 95.9160, aux.loss_ce: 0.0530, aux.acc_seg: 94.1200, loss: 0.1433, grad_norm: 1.1237
2023-02-19 15:48:44,130 - mmseg - INFO - Iter [147950/160000]	lr: 4.519e-06, eta: 0:57:24, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.8284, aux.loss_ce: 0.0558, aux.acc_seg: 93.7962, loss: 0.1476, grad_norm: 1.1416
2023-02-19 15:48:58,972 - mmseg - INFO - Saving checkpoint at 148000 iterations
2023-02-19 15:49:02,284 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:49:02,284 - mmseg - INFO - Iter [148000/160000]	lr: 4.500e-06, eta: 0:57:10, time: 0.364, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7638, aux.loss_ce: 0.0572, aux.acc_seg: 93.7563, loss: 0.1514, grad_norm: 1.2149
2023-02-19 15:49:16,282 - mmseg - INFO - Iter [148050/160000]	lr: 4.482e-06, eta: 0:56:56, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7807, aux.loss_ce: 0.0565, aux.acc_seg: 93.8234, loss: 0.1506, grad_norm: 1.3129
2023-02-19 15:49:30,199 - mmseg - INFO - Iter [148100/160000]	lr: 4.463e-06, eta: 0:56:42, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.8315, aux.loss_ce: 0.0578, aux.acc_seg: 93.8221, loss: 0.1525, grad_norm: 1.5658
2023-02-19 15:49:44,405 - mmseg - INFO - Iter [148150/160000]	lr: 4.444e-06, eta: 0:56:27, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0913, decode.acc_seg: 95.9657, aux.loss_ce: 0.0532, aux.acc_seg: 94.2709, loss: 0.1445, grad_norm: 2.1322
2023-02-19 15:49:58,462 - mmseg - INFO - Iter [148200/160000]	lr: 4.425e-06, eta: 0:56:13, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.5183, aux.loss_ce: 0.0619, aux.acc_seg: 93.2916, loss: 0.1631, grad_norm: 1.6993
2023-02-19 15:50:12,286 - mmseg - INFO - Iter [148250/160000]	lr: 4.407e-06, eta: 0:55:59, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0963, decode.acc_seg: 95.6340, aux.loss_ce: 0.0578, aux.acc_seg: 93.6605, loss: 0.1541, grad_norm: 1.5284
2023-02-19 15:50:25,904 - mmseg - INFO - Iter [148300/160000]	lr: 4.388e-06, eta: 0:55:44, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0953, decode.acc_seg: 95.7384, aux.loss_ce: 0.0554, aux.acc_seg: 93.9899, loss: 0.1507, grad_norm: 1.2186
2023-02-19 15:50:39,758 - mmseg - INFO - Iter [148350/160000]	lr: 4.369e-06, eta: 0:55:30, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.8670, aux.loss_ce: 0.0567, aux.acc_seg: 93.9799, loss: 0.1504, grad_norm: 1.3232
2023-02-19 15:50:54,288 - mmseg - INFO - Iter [148400/160000]	lr: 4.350e-06, eta: 0:55:16, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7797, aux.loss_ce: 0.0574, aux.acc_seg: 93.7813, loss: 0.1515, grad_norm: 1.2084
2023-02-19 15:51:08,420 - mmseg - INFO - Iter [148450/160000]	lr: 4.332e-06, eta: 0:55:01, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0938, decode.acc_seg: 95.7503, aux.loss_ce: 0.0554, aux.acc_seg: 93.9660, loss: 0.1492, grad_norm: 1.3251
2023-02-19 15:51:22,069 - mmseg - INFO - Iter [148500/160000]	lr: 4.313e-06, eta: 0:54:47, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.9483, aux.loss_ce: 0.0555, aux.acc_seg: 94.0053, loss: 0.1474, grad_norm: 1.2051
2023-02-19 15:51:36,590 - mmseg - INFO - Iter [148550/160000]	lr: 4.294e-06, eta: 0:54:33, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8934, aux.loss_ce: 0.0563, aux.acc_seg: 93.9888, loss: 0.1493, grad_norm: 1.5568
2023-02-19 15:51:50,447 - mmseg - INFO - Iter [148600/160000]	lr: 4.275e-06, eta: 0:54:18, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7433, aux.loss_ce: 0.0582, aux.acc_seg: 93.6590, loss: 0.1523, grad_norm: 1.4255
2023-02-19 15:52:04,550 - mmseg - INFO - Iter [148650/160000]	lr: 4.257e-06, eta: 0:54:04, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8185, aux.loss_ce: 0.0574, aux.acc_seg: 93.7820, loss: 0.1504, grad_norm: 1.8706
2023-02-19 15:52:18,346 - mmseg - INFO - Iter [148700/160000]	lr: 4.238e-06, eta: 0:53:50, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.8820, aux.loss_ce: 0.0560, aux.acc_seg: 93.8830, loss: 0.1480, grad_norm: 1.4749
2023-02-19 15:52:32,183 - mmseg - INFO - Iter [148750/160000]	lr: 4.219e-06, eta: 0:53:35, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.7960, aux.loss_ce: 0.0571, aux.acc_seg: 93.7978, loss: 0.1516, grad_norm: 1.3194
2023-02-19 15:52:45,820 - mmseg - INFO - Iter [148800/160000]	lr: 4.200e-06, eta: 0:53:21, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0886, decode.acc_seg: 96.0393, aux.loss_ce: 0.0549, aux.acc_seg: 93.9989, loss: 0.1436, grad_norm: 1.2431
2023-02-19 15:52:59,616 - mmseg - INFO - Iter [148850/160000]	lr: 4.182e-06, eta: 0:53:07, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8741, aux.loss_ce: 0.0573, aux.acc_seg: 93.7179, loss: 0.1491, grad_norm: 1.2748
2023-02-19 15:53:13,452 - mmseg - INFO - Iter [148900/160000]	lr: 4.163e-06, eta: 0:52:52, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0950, decode.acc_seg: 95.7417, aux.loss_ce: 0.0567, aux.acc_seg: 93.7959, loss: 0.1517, grad_norm: 1.7603
2023-02-19 15:53:27,211 - mmseg - INFO - Iter [148950/160000]	lr: 4.144e-06, eta: 0:52:38, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.8846, aux.loss_ce: 0.0576, aux.acc_seg: 93.9452, loss: 0.1525, grad_norm: 1.2012
2023-02-19 15:53:41,131 - mmseg - INFO - Saving checkpoint at 149000 iterations
2023-02-19 15:53:44,404 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:53:44,405 - mmseg - INFO - Iter [149000/160000]	lr: 4.125e-06, eta: 0:52:24, time: 0.344, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7291, aux.loss_ce: 0.0562, aux.acc_seg: 93.7940, loss: 0.1514, grad_norm: 1.1683
2023-02-19 15:54:00,163 - mmseg - INFO - Iter [149050/160000]	lr: 4.107e-06, eta: 0:52:10, time: 0.315, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0921, decode.acc_seg: 95.8767, aux.loss_ce: 0.0555, aux.acc_seg: 93.9995, loss: 0.1476, grad_norm: 1.3531
2023-02-19 15:54:14,667 - mmseg - INFO - Iter [149100/160000]	lr: 4.088e-06, eta: 0:51:56, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0942, decode.acc_seg: 95.8065, aux.loss_ce: 0.0572, aux.acc_seg: 93.8794, loss: 0.1514, grad_norm: 1.3094
2023-02-19 15:54:28,994 - mmseg - INFO - Iter [149150/160000]	lr: 4.069e-06, eta: 0:51:41, time: 0.287, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.8358, aux.loss_ce: 0.0550, aux.acc_seg: 93.9808, loss: 0.1470, grad_norm: 1.2238
2023-02-19 15:54:42,684 - mmseg - INFO - Iter [149200/160000]	lr: 4.050e-06, eta: 0:51:27, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0867, decode.acc_seg: 96.1369, aux.loss_ce: 0.0548, aux.acc_seg: 94.0084, loss: 0.1415, grad_norm: 1.2563
2023-02-19 15:54:56,986 - mmseg - INFO - Iter [149250/160000]	lr: 4.032e-06, eta: 0:51:13, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8252, aux.loss_ce: 0.0565, aux.acc_seg: 93.8355, loss: 0.1498, grad_norm: 1.1963
2023-02-19 15:55:10,572 - mmseg - INFO - Iter [149300/160000]	lr: 4.013e-06, eta: 0:50:58, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.7027, aux.loss_ce: 0.0570, aux.acc_seg: 93.7562, loss: 0.1528, grad_norm: 1.1095
2023-02-19 15:55:24,432 - mmseg - INFO - Iter [149350/160000]	lr: 3.994e-06, eta: 0:50:44, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.6566, aux.loss_ce: 0.0578, aux.acc_seg: 93.6624, loss: 0.1531, grad_norm: 1.3236
2023-02-19 15:55:38,495 - mmseg - INFO - Iter [149400/160000]	lr: 3.975e-06, eta: 0:50:30, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.8540, aux.loss_ce: 0.0540, aux.acc_seg: 94.0931, loss: 0.1455, grad_norm: 1.2348
2023-02-19 15:55:52,351 - mmseg - INFO - Iter [149450/160000]	lr: 3.957e-06, eta: 0:50:15, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0916, decode.acc_seg: 95.8408, aux.loss_ce: 0.0556, aux.acc_seg: 93.9463, loss: 0.1472, grad_norm: 1.3923
2023-02-19 15:56:06,154 - mmseg - INFO - Iter [149500/160000]	lr: 3.938e-06, eta: 0:50:01, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.9659, aux.loss_ce: 0.0538, aux.acc_seg: 94.1572, loss: 0.1434, grad_norm: 1.0345
2023-02-19 15:56:19,871 - mmseg - INFO - Iter [149550/160000]	lr: 3.919e-06, eta: 0:49:47, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.8962, aux.loss_ce: 0.0564, aux.acc_seg: 93.9891, loss: 0.1506, grad_norm: 1.4566
2023-02-19 15:56:33,795 - mmseg - INFO - Iter [149600/160000]	lr: 3.900e-06, eta: 0:49:32, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.6964, aux.loss_ce: 0.0554, aux.acc_seg: 93.8229, loss: 0.1495, grad_norm: 1.1378
2023-02-19 15:56:47,452 - mmseg - INFO - Iter [149650/160000]	lr: 3.882e-06, eta: 0:49:18, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0873, decode.acc_seg: 96.0795, aux.loss_ce: 0.0529, aux.acc_seg: 94.2266, loss: 0.1401, grad_norm: 1.2490
2023-02-19 15:57:01,635 - mmseg - INFO - Iter [149700/160000]	lr: 3.863e-06, eta: 0:49:04, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.6801, aux.loss_ce: 0.0559, aux.acc_seg: 93.9421, loss: 0.1524, grad_norm: 1.2981
2023-02-19 15:57:15,335 - mmseg - INFO - Iter [149750/160000]	lr: 3.844e-06, eta: 0:48:49, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0990, decode.acc_seg: 95.5724, aux.loss_ce: 0.0594, aux.acc_seg: 93.5267, loss: 0.1585, grad_norm: 1.3261
2023-02-19 15:57:29,714 - mmseg - INFO - Iter [149800/160000]	lr: 3.825e-06, eta: 0:48:35, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8627, aux.loss_ce: 0.0562, aux.acc_seg: 93.9025, loss: 0.1495, grad_norm: 1.1368
2023-02-19 15:57:43,329 - mmseg - INFO - Iter [149850/160000]	lr: 3.807e-06, eta: 0:48:21, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.9115, aux.loss_ce: 0.0562, aux.acc_seg: 93.8636, loss: 0.1474, grad_norm: 1.2756
2023-02-19 15:57:56,919 - mmseg - INFO - Iter [149900/160000]	lr: 3.788e-06, eta: 0:48:06, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.8900, aux.loss_ce: 0.0549, aux.acc_seg: 93.9747, loss: 0.1457, grad_norm: 1.1596
2023-02-19 15:58:10,584 - mmseg - INFO - Iter [149950/160000]	lr: 3.769e-06, eta: 0:47:52, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.8263, aux.loss_ce: 0.0543, aux.acc_seg: 94.0176, loss: 0.1459, grad_norm: 1.4619
2023-02-19 15:58:24,183 - mmseg - INFO - Saving checkpoint at 150000 iterations
2023-02-19 15:58:27,485 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 15:58:27,485 - mmseg - INFO - Iter [150000/160000]	lr: 3.750e-06, eta: 0:47:38, time: 0.338, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.8792, aux.loss_ce: 0.0547, aux.acc_seg: 93.9670, loss: 0.1456, grad_norm: 1.1164
2023-02-19 15:58:41,282 - mmseg - INFO - Iter [150050/160000]	lr: 3.732e-06, eta: 0:47:24, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0894, decode.acc_seg: 96.0046, aux.loss_ce: 0.0535, aux.acc_seg: 94.2416, loss: 0.1429, grad_norm: 1.1844
2023-02-19 15:58:55,091 - mmseg - INFO - Iter [150100/160000]	lr: 3.713e-06, eta: 0:47:09, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0951, decode.acc_seg: 95.7222, aux.loss_ce: 0.0580, aux.acc_seg: 93.5247, loss: 0.1531, grad_norm: 1.4812
2023-02-19 15:59:08,713 - mmseg - INFO - Iter [150150/160000]	lr: 3.694e-06, eta: 0:46:55, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.7685, aux.loss_ce: 0.0561, aux.acc_seg: 93.8057, loss: 0.1498, grad_norm: 1.1646
2023-02-19 15:59:22,483 - mmseg - INFO - Iter [150200/160000]	lr: 3.675e-06, eta: 0:46:41, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.8646, aux.loss_ce: 0.0570, aux.acc_seg: 93.8577, loss: 0.1485, grad_norm: 1.2053
2023-02-19 15:59:36,124 - mmseg - INFO - Iter [150250/160000]	lr: 3.657e-06, eta: 0:46:26, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0935, decode.acc_seg: 95.8260, aux.loss_ce: 0.0566, aux.acc_seg: 93.8733, loss: 0.1500, grad_norm: 1.2273
2023-02-19 15:59:52,573 - mmseg - INFO - Iter [150300/160000]	lr: 3.638e-06, eta: 0:46:12, time: 0.329, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0881, decode.acc_seg: 96.0754, aux.loss_ce: 0.0536, aux.acc_seg: 94.1685, loss: 0.1417, grad_norm: 1.3884
2023-02-19 16:00:06,619 - mmseg - INFO - Iter [150350/160000]	lr: 3.619e-06, eta: 0:45:58, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8667, aux.loss_ce: 0.0561, aux.acc_seg: 93.9230, loss: 0.1493, grad_norm: 1.1974
2023-02-19 16:00:20,359 - mmseg - INFO - Iter [150400/160000]	lr: 3.600e-06, eta: 0:45:44, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0958, decode.acc_seg: 95.7488, aux.loss_ce: 0.0579, aux.acc_seg: 93.7862, loss: 0.1536, grad_norm: 1.4965
2023-02-19 16:00:34,525 - mmseg - INFO - Iter [150450/160000]	lr: 3.582e-06, eta: 0:45:29, time: 0.283, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0892, decode.acc_seg: 95.9788, aux.loss_ce: 0.0536, aux.acc_seg: 94.1818, loss: 0.1429, grad_norm: 1.2040
2023-02-19 16:00:49,594 - mmseg - INFO - Iter [150500/160000]	lr: 3.563e-06, eta: 0:45:15, time: 0.302, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0902, decode.acc_seg: 95.9378, aux.loss_ce: 0.0546, aux.acc_seg: 94.0974, loss: 0.1448, grad_norm: 1.3066
2023-02-19 16:01:03,736 - mmseg - INFO - Iter [150550/160000]	lr: 3.544e-06, eta: 0:45:01, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8004, aux.loss_ce: 0.0551, aux.acc_seg: 93.9309, loss: 0.1470, grad_norm: 1.3765
2023-02-19 16:01:17,439 - mmseg - INFO - Iter [150600/160000]	lr: 3.525e-06, eta: 0:44:46, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0944, decode.acc_seg: 95.8481, aux.loss_ce: 0.0576, aux.acc_seg: 93.8254, loss: 0.1521, grad_norm: 1.2961
2023-02-19 16:01:31,709 - mmseg - INFO - Iter [150650/160000]	lr: 3.507e-06, eta: 0:44:32, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.7825, aux.loss_ce: 0.0569, aux.acc_seg: 93.8381, loss: 0.1524, grad_norm: 1.5495
2023-02-19 16:01:45,620 - mmseg - INFO - Iter [150700/160000]	lr: 3.488e-06, eta: 0:44:18, time: 0.279, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8298, aux.loss_ce: 0.0562, aux.acc_seg: 93.8442, loss: 0.1494, grad_norm: 1.2226
2023-02-19 16:01:59,682 - mmseg - INFO - Iter [150750/160000]	lr: 3.469e-06, eta: 0:44:03, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0894, decode.acc_seg: 95.9796, aux.loss_ce: 0.0558, aux.acc_seg: 94.0419, loss: 0.1451, grad_norm: 1.1663
2023-02-19 16:02:13,988 - mmseg - INFO - Iter [150800/160000]	lr: 3.450e-06, eta: 0:43:49, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0949, decode.acc_seg: 95.7452, aux.loss_ce: 0.0568, aux.acc_seg: 93.8256, loss: 0.1517, grad_norm: 1.4768
2023-02-19 16:02:28,062 - mmseg - INFO - Iter [150850/160000]	lr: 3.432e-06, eta: 0:43:35, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0943, decode.acc_seg: 95.6689, aux.loss_ce: 0.0570, aux.acc_seg: 93.7028, loss: 0.1513, grad_norm: 1.4006
2023-02-19 16:02:42,320 - mmseg - INFO - Iter [150900/160000]	lr: 3.413e-06, eta: 0:43:21, time: 0.285, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0942, decode.acc_seg: 95.8167, aux.loss_ce: 0.0569, aux.acc_seg: 93.8512, loss: 0.1511, grad_norm: 1.3505
2023-02-19 16:02:56,789 - mmseg - INFO - Iter [150950/160000]	lr: 3.394e-06, eta: 0:43:06, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0923, decode.acc_seg: 95.8681, aux.loss_ce: 0.0551, aux.acc_seg: 94.0515, loss: 0.1475, grad_norm: 1.3684
2023-02-19 16:03:10,552 - mmseg - INFO - Saving checkpoint at 151000 iterations
2023-02-19 16:03:13,798 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:03:13,798 - mmseg - INFO - Iter [151000/160000]	lr: 3.375e-06, eta: 0:42:52, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.7609, aux.loss_ce: 0.0593, aux.acc_seg: 93.6892, loss: 0.1553, grad_norm: 1.2957
2023-02-19 16:03:27,628 - mmseg - INFO - Iter [151050/160000]	lr: 3.357e-06, eta: 0:42:38, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.9385, aux.loss_ce: 0.0571, aux.acc_seg: 93.8596, loss: 0.1491, grad_norm: 1.4917
2023-02-19 16:03:41,286 - mmseg - INFO - Iter [151100/160000]	lr: 3.338e-06, eta: 0:42:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0897, decode.acc_seg: 95.9060, aux.loss_ce: 0.0537, aux.acc_seg: 94.0524, loss: 0.1434, grad_norm: 1.1363
2023-02-19 16:03:55,456 - mmseg - INFO - Iter [151150/160000]	lr: 3.319e-06, eta: 0:42:09, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0988, decode.acc_seg: 95.7134, aux.loss_ce: 0.0579, aux.acc_seg: 93.8153, loss: 0.1567, grad_norm: 1.7429
2023-02-19 16:04:09,537 - mmseg - INFO - Iter [151200/160000]	lr: 3.300e-06, eta: 0:41:55, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0937, decode.acc_seg: 95.7272, aux.loss_ce: 0.0569, aux.acc_seg: 93.8022, loss: 0.1506, grad_norm: 1.6222
2023-02-19 16:04:23,307 - mmseg - INFO - Iter [151250/160000]	lr: 3.282e-06, eta: 0:41:41, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0894, decode.acc_seg: 95.9490, aux.loss_ce: 0.0543, aux.acc_seg: 94.0062, loss: 0.1437, grad_norm: 1.1137
2023-02-19 16:04:37,236 - mmseg - INFO - Iter [151300/160000]	lr: 3.263e-06, eta: 0:41:26, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0886, decode.acc_seg: 95.9596, aux.loss_ce: 0.0535, aux.acc_seg: 94.0652, loss: 0.1421, grad_norm: 1.0579
2023-02-19 16:04:52,041 - mmseg - INFO - Iter [151350/160000]	lr: 3.244e-06, eta: 0:41:12, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.9042, aux.loss_ce: 0.0560, aux.acc_seg: 93.9542, loss: 0.1475, grad_norm: 1.0446
2023-02-19 16:05:06,001 - mmseg - INFO - Iter [151400/160000]	lr: 3.225e-06, eta: 0:40:58, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.9540, aux.loss_ce: 0.0553, aux.acc_seg: 94.0721, loss: 0.1467, grad_norm: 1.2971
2023-02-19 16:05:20,590 - mmseg - INFO - Iter [151450/160000]	lr: 3.207e-06, eta: 0:40:43, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.6824, aux.loss_ce: 0.0548, aux.acc_seg: 93.8561, loss: 0.1501, grad_norm: 1.2893
2023-02-19 16:05:34,225 - mmseg - INFO - Iter [151500/160000]	lr: 3.188e-06, eta: 0:40:29, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 95.9800, aux.loss_ce: 0.0537, aux.acc_seg: 94.0969, loss: 0.1438, grad_norm: 1.5023
2023-02-19 16:05:47,901 - mmseg - INFO - Iter [151550/160000]	lr: 3.169e-06, eta: 0:40:15, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0919, decode.acc_seg: 95.8384, aux.loss_ce: 0.0553, aux.acc_seg: 93.9243, loss: 0.1472, grad_norm: 1.2369
2023-02-19 16:06:03,685 - mmseg - INFO - Iter [151600/160000]	lr: 3.150e-06, eta: 0:40:01, time: 0.316, data_time: 0.046, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.8343, aux.loss_ce: 0.0545, aux.acc_seg: 94.0274, loss: 0.1470, grad_norm: 1.3983
2023-02-19 16:06:17,327 - mmseg - INFO - Iter [151650/160000]	lr: 3.132e-06, eta: 0:39:46, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0934, decode.acc_seg: 95.7554, aux.loss_ce: 0.0566, aux.acc_seg: 93.8777, loss: 0.1500, grad_norm: 1.2425
2023-02-19 16:06:31,388 - mmseg - INFO - Iter [151700/160000]	lr: 3.113e-06, eta: 0:39:32, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0884, decode.acc_seg: 96.0065, aux.loss_ce: 0.0525, aux.acc_seg: 94.2798, loss: 0.1409, grad_norm: 1.1757
2023-02-19 16:06:45,288 - mmseg - INFO - Iter [151750/160000]	lr: 3.094e-06, eta: 0:39:18, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.8154, aux.loss_ce: 0.0532, aux.acc_seg: 94.0030, loss: 0.1427, grad_norm: 1.4025
2023-02-19 16:06:58,897 - mmseg - INFO - Iter [151800/160000]	lr: 3.075e-06, eta: 0:39:03, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0964, decode.acc_seg: 95.8141, aux.loss_ce: 0.0564, aux.acc_seg: 93.9593, loss: 0.1528, grad_norm: 1.6623
2023-02-19 16:07:13,249 - mmseg - INFO - Iter [151850/160000]	lr: 3.057e-06, eta: 0:38:49, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0936, decode.acc_seg: 95.8407, aux.loss_ce: 0.0567, aux.acc_seg: 93.9820, loss: 0.1503, grad_norm: 1.2036
2023-02-19 16:07:26,974 - mmseg - INFO - Iter [151900/160000]	lr: 3.038e-06, eta: 0:38:35, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0878, decode.acc_seg: 95.9845, aux.loss_ce: 0.0526, aux.acc_seg: 94.2064, loss: 0.1403, grad_norm: 1.0400
2023-02-19 16:07:40,727 - mmseg - INFO - Iter [151950/160000]	lr: 3.019e-06, eta: 0:38:20, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0981, decode.acc_seg: 95.6612, aux.loss_ce: 0.0587, aux.acc_seg: 93.6172, loss: 0.1568, grad_norm: 1.3188
2023-02-19 16:07:54,834 - mmseg - INFO - Saving checkpoint at 152000 iterations
2023-02-19 16:07:58,057 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:07:58,057 - mmseg - INFO - Iter [152000/160000]	lr: 3.000e-06, eta: 0:38:06, time: 0.347, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1012, decode.acc_seg: 95.5451, aux.loss_ce: 0.0593, aux.acc_seg: 93.6726, loss: 0.1605, grad_norm: 1.4691
2023-02-19 16:08:11,609 - mmseg - INFO - Iter [152050/160000]	lr: 2.982e-06, eta: 0:37:52, time: 0.271, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0856, decode.acc_seg: 96.1772, aux.loss_ce: 0.0535, aux.acc_seg: 94.1814, loss: 0.1392, grad_norm: 1.1843
2023-02-19 16:08:25,377 - mmseg - INFO - Iter [152100/160000]	lr: 2.963e-06, eta: 0:37:38, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0961, decode.acc_seg: 95.7338, aux.loss_ce: 0.0572, aux.acc_seg: 93.8456, loss: 0.1533, grad_norm: 1.2330
2023-02-19 16:08:39,186 - mmseg - INFO - Iter [152150/160000]	lr: 2.944e-06, eta: 0:37:23, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0991, decode.acc_seg: 95.7306, aux.loss_ce: 0.0579, aux.acc_seg: 93.8010, loss: 0.1570, grad_norm: 1.9263
2023-02-19 16:08:52,954 - mmseg - INFO - Iter [152200/160000]	lr: 2.925e-06, eta: 0:37:09, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.8023, aux.loss_ce: 0.0538, aux.acc_seg: 94.0613, loss: 0.1450, grad_norm: 1.4357
2023-02-19 16:09:07,103 - mmseg - INFO - Iter [152250/160000]	lr: 2.907e-06, eta: 0:36:55, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0879, decode.acc_seg: 95.9914, aux.loss_ce: 0.0527, aux.acc_seg: 94.1569, loss: 0.1407, grad_norm: 1.1992
2023-02-19 16:09:20,990 - mmseg - INFO - Iter [152300/160000]	lr: 2.888e-06, eta: 0:36:40, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7055, aux.loss_ce: 0.0569, aux.acc_seg: 93.6895, loss: 0.1510, grad_norm: 1.4420
2023-02-19 16:09:34,815 - mmseg - INFO - Iter [152350/160000]	lr: 2.869e-06, eta: 0:36:26, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0903, decode.acc_seg: 95.8718, aux.loss_ce: 0.0550, aux.acc_seg: 93.8818, loss: 0.1453, grad_norm: 0.9399
2023-02-19 16:09:48,672 - mmseg - INFO - Iter [152400/160000]	lr: 2.850e-06, eta: 0:36:12, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7358, aux.loss_ce: 0.0568, aux.acc_seg: 93.8175, loss: 0.1519, grad_norm: 1.3632
2023-02-19 16:10:02,283 - mmseg - INFO - Iter [152450/160000]	lr: 2.832e-06, eta: 0:35:57, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0971, decode.acc_seg: 95.7867, aux.loss_ce: 0.0575, aux.acc_seg: 93.8707, loss: 0.1546, grad_norm: 1.4493
2023-02-19 16:10:16,828 - mmseg - INFO - Iter [152500/160000]	lr: 2.813e-06, eta: 0:35:43, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0965, decode.acc_seg: 95.6858, aux.loss_ce: 0.0582, aux.acc_seg: 93.6907, loss: 0.1547, grad_norm: 1.2977
2023-02-19 16:10:31,120 - mmseg - INFO - Iter [152550/160000]	lr: 2.794e-06, eta: 0:35:29, time: 0.286, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 95.9054, aux.loss_ce: 0.0532, aux.acc_seg: 94.1520, loss: 0.1433, grad_norm: 1.1993
2023-02-19 16:10:44,694 - mmseg - INFO - Iter [152600/160000]	lr: 2.775e-06, eta: 0:35:14, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.9009, aux.loss_ce: 0.0550, aux.acc_seg: 94.0688, loss: 0.1461, grad_norm: 1.2465
2023-02-19 16:10:58,658 - mmseg - INFO - Iter [152650/160000]	lr: 2.757e-06, eta: 0:35:00, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0946, decode.acc_seg: 95.7652, aux.loss_ce: 0.0567, aux.acc_seg: 93.9122, loss: 0.1513, grad_norm: 1.3709
2023-02-19 16:11:12,745 - mmseg - INFO - Iter [152700/160000]	lr: 2.738e-06, eta: 0:34:46, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0960, decode.acc_seg: 95.5540, aux.loss_ce: 0.0579, aux.acc_seg: 93.5158, loss: 0.1539, grad_norm: 1.3860
2023-02-19 16:11:26,763 - mmseg - INFO - Iter [152750/160000]	lr: 2.719e-06, eta: 0:34:32, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0911, decode.acc_seg: 95.9284, aux.loss_ce: 0.0560, aux.acc_seg: 93.8674, loss: 0.1471, grad_norm: 1.1700
2023-02-19 16:11:41,015 - mmseg - INFO - Iter [152800/160000]	lr: 2.700e-06, eta: 0:34:17, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0939, decode.acc_seg: 95.8682, aux.loss_ce: 0.0559, aux.acc_seg: 94.0371, loss: 0.1498, grad_norm: 1.6753
2023-02-19 16:11:56,998 - mmseg - INFO - Iter [152850/160000]	lr: 2.682e-06, eta: 0:34:03, time: 0.320, data_time: 0.047, memory: 15214, decode.loss_ce: 0.1005, decode.acc_seg: 95.7430, aux.loss_ce: 0.0586, aux.acc_seg: 93.8424, loss: 0.1591, grad_norm: 1.6609
2023-02-19 16:12:10,984 - mmseg - INFO - Iter [152900/160000]	lr: 2.663e-06, eta: 0:33:49, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0893, decode.acc_seg: 95.9701, aux.loss_ce: 0.0555, aux.acc_seg: 94.0484, loss: 0.1448, grad_norm: 1.3922
2023-02-19 16:12:24,715 - mmseg - INFO - Iter [152950/160000]	lr: 2.644e-06, eta: 0:33:34, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.9434, aux.loss_ce: 0.0542, aux.acc_seg: 94.0702, loss: 0.1459, grad_norm: 1.1160
2023-02-19 16:12:38,631 - mmseg - INFO - Saving checkpoint at 153000 iterations
2023-02-19 16:12:41,938 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:12:41,938 - mmseg - INFO - Iter [153000/160000]	lr: 2.625e-06, eta: 0:33:20, time: 0.345, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0970, decode.acc_seg: 95.6750, aux.loss_ce: 0.0576, aux.acc_seg: 93.6999, loss: 0.1545, grad_norm: 1.6110
2023-02-19 16:12:55,723 - mmseg - INFO - Iter [153050/160000]	lr: 2.607e-06, eta: 0:33:06, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0943, decode.acc_seg: 95.8034, aux.loss_ce: 0.0559, aux.acc_seg: 93.9644, loss: 0.1502, grad_norm: 1.7345
2023-02-19 16:13:09,587 - mmseg - INFO - Iter [153100/160000]	lr: 2.588e-06, eta: 0:32:52, time: 0.277, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0918, decode.acc_seg: 95.8764, aux.loss_ce: 0.0548, aux.acc_seg: 94.0140, loss: 0.1466, grad_norm: 1.0838
2023-02-19 16:13:23,418 - mmseg - INFO - Iter [153150/160000]	lr: 2.569e-06, eta: 0:32:37, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0880, decode.acc_seg: 95.9771, aux.loss_ce: 0.0517, aux.acc_seg: 94.3500, loss: 0.1397, grad_norm: 1.3999
2023-02-19 16:13:37,264 - mmseg - INFO - Iter [153200/160000]	lr: 2.550e-06, eta: 0:32:23, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0895, decode.acc_seg: 95.9630, aux.loss_ce: 0.0534, aux.acc_seg: 94.1407, loss: 0.1430, grad_norm: 1.1495
2023-02-19 16:13:51,290 - mmseg - INFO - Iter [153250/160000]	lr: 2.532e-06, eta: 0:32:09, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.8408, aux.loss_ce: 0.0549, aux.acc_seg: 94.0852, loss: 0.1471, grad_norm: 1.1719
2023-02-19 16:14:06,501 - mmseg - INFO - Iter [153300/160000]	lr: 2.513e-06, eta: 0:31:54, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0918, decode.acc_seg: 95.8480, aux.loss_ce: 0.0553, aux.acc_seg: 94.0000, loss: 0.1471, grad_norm: 1.8219
2023-02-19 16:14:20,235 - mmseg - INFO - Iter [153350/160000]	lr: 2.494e-06, eta: 0:31:40, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.7627, aux.loss_ce: 0.0561, aux.acc_seg: 93.9167, loss: 0.1509, grad_norm: 1.4999
2023-02-19 16:14:34,008 - mmseg - INFO - Iter [153400/160000]	lr: 2.475e-06, eta: 0:31:26, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0903, decode.acc_seg: 95.8721, aux.loss_ce: 0.0541, aux.acc_seg: 93.9688, loss: 0.1444, grad_norm: 1.0637
2023-02-19 16:14:47,776 - mmseg - INFO - Iter [153450/160000]	lr: 2.457e-06, eta: 0:31:12, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0947, decode.acc_seg: 95.7540, aux.loss_ce: 0.0569, aux.acc_seg: 93.8163, loss: 0.1516, grad_norm: 1.0524
2023-02-19 16:15:02,398 - mmseg - INFO - Iter [153500/160000]	lr: 2.438e-06, eta: 0:30:57, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0868, decode.acc_seg: 96.0973, aux.loss_ce: 0.0526, aux.acc_seg: 94.2828, loss: 0.1394, grad_norm: 0.9449
2023-02-19 16:15:16,485 - mmseg - INFO - Iter [153550/160000]	lr: 2.419e-06, eta: 0:30:43, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8942, aux.loss_ce: 0.0565, aux.acc_seg: 93.9524, loss: 0.1494, grad_norm: 1.4183
2023-02-19 16:15:30,493 - mmseg - INFO - Iter [153600/160000]	lr: 2.400e-06, eta: 0:30:29, time: 0.281, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0894, decode.acc_seg: 95.9830, aux.loss_ce: 0.0554, aux.acc_seg: 94.0133, loss: 0.1447, grad_norm: 0.9898
2023-02-19 16:15:44,690 - mmseg - INFO - Iter [153650/160000]	lr: 2.382e-06, eta: 0:30:14, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0899, decode.acc_seg: 95.8348, aux.loss_ce: 0.0536, aux.acc_seg: 94.0515, loss: 0.1434, grad_norm: 1.0712
2023-02-19 16:15:59,174 - mmseg - INFO - Iter [153700/160000]	lr: 2.363e-06, eta: 0:30:00, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.8757, aux.loss_ce: 0.0551, aux.acc_seg: 94.0626, loss: 0.1482, grad_norm: 1.2715
2023-02-19 16:16:13,089 - mmseg - INFO - Iter [153750/160000]	lr: 2.344e-06, eta: 0:29:46, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.8315, aux.loss_ce: 0.0558, aux.acc_seg: 93.8677, loss: 0.1483, grad_norm: 1.1749
2023-02-19 16:16:27,172 - mmseg - INFO - Iter [153800/160000]	lr: 2.325e-06, eta: 0:29:31, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8717, aux.loss_ce: 0.0552, aux.acc_seg: 94.0463, loss: 0.1484, grad_norm: 1.1438
2023-02-19 16:16:40,939 - mmseg - INFO - Iter [153850/160000]	lr: 2.307e-06, eta: 0:29:17, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0895, decode.acc_seg: 95.9264, aux.loss_ce: 0.0533, aux.acc_seg: 94.1569, loss: 0.1428, grad_norm: 1.2791
2023-02-19 16:16:54,640 - mmseg - INFO - Iter [153900/160000]	lr: 2.288e-06, eta: 0:29:03, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.9254, aux.loss_ce: 0.0543, aux.acc_seg: 94.0739, loss: 0.1439, grad_norm: 1.4482
2023-02-19 16:17:08,409 - mmseg - INFO - Iter [153950/160000]	lr: 2.269e-06, eta: 0:28:49, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0898, decode.acc_seg: 95.9713, aux.loss_ce: 0.0536, aux.acc_seg: 94.1530, loss: 0.1434, grad_norm: 1.1730
2023-02-19 16:17:22,255 - mmseg - INFO - Saving checkpoint at 154000 iterations
2023-02-19 16:17:25,533 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:17:25,533 - mmseg - INFO - Iter [154000/160000]	lr: 2.250e-06, eta: 0:28:34, time: 0.343, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0887, decode.acc_seg: 95.8766, aux.loss_ce: 0.0532, aux.acc_seg: 93.9975, loss: 0.1419, grad_norm: 1.2090
2023-02-19 16:17:39,200 - mmseg - INFO - Iter [154050/160000]	lr: 2.232e-06, eta: 0:28:20, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0874, decode.acc_seg: 96.1019, aux.loss_ce: 0.0528, aux.acc_seg: 94.2999, loss: 0.1402, grad_norm: 1.0656
2023-02-19 16:17:55,012 - mmseg - INFO - Iter [154100/160000]	lr: 2.213e-06, eta: 0:28:06, time: 0.316, data_time: 0.049, memory: 15214, decode.loss_ce: 0.0933, decode.acc_seg: 95.8382, aux.loss_ce: 0.0557, aux.acc_seg: 93.9610, loss: 0.1489, grad_norm: 1.5295
2023-02-19 16:18:09,222 - mmseg - INFO - Iter [154150/160000]	lr: 2.194e-06, eta: 0:27:52, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0929, decode.acc_seg: 95.8049, aux.loss_ce: 0.0550, aux.acc_seg: 93.9673, loss: 0.1480, grad_norm: 1.2165
2023-02-19 16:18:24,122 - mmseg - INFO - Iter [154200/160000]	lr: 2.175e-06, eta: 0:27:37, time: 0.297, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.8668, aux.loss_ce: 0.0530, aux.acc_seg: 94.1803, loss: 0.1445, grad_norm: 1.3729
2023-02-19 16:18:38,685 - mmseg - INFO - Iter [154250/160000]	lr: 2.157e-06, eta: 0:27:23, time: 0.292, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.8146, aux.loss_ce: 0.0555, aux.acc_seg: 93.8957, loss: 0.1480, grad_norm: 1.4920
2023-02-19 16:18:53,650 - mmseg - INFO - Iter [154300/160000]	lr: 2.138e-06, eta: 0:27:09, time: 0.299, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.9654, aux.loss_ce: 0.0548, aux.acc_seg: 94.0125, loss: 0.1443, grad_norm: 1.0183
2023-02-19 16:19:07,200 - mmseg - INFO - Iter [154350/160000]	lr: 2.119e-06, eta: 0:26:54, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0900, decode.acc_seg: 96.0125, aux.loss_ce: 0.0553, aux.acc_seg: 93.9285, loss: 0.1453, grad_norm: 1.1514
2023-02-19 16:19:21,115 - mmseg - INFO - Iter [154400/160000]	lr: 2.100e-06, eta: 0:26:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.9201, aux.loss_ce: 0.0536, aux.acc_seg: 94.2113, loss: 0.1453, grad_norm: 1.1043
2023-02-19 16:19:34,811 - mmseg - INFO - Iter [154450/160000]	lr: 2.082e-06, eta: 0:26:26, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8044, aux.loss_ce: 0.0553, aux.acc_seg: 93.9382, loss: 0.1484, grad_norm: 1.0931
2023-02-19 16:19:48,877 - mmseg - INFO - Iter [154500/160000]	lr: 2.063e-06, eta: 0:26:11, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.7315, aux.loss_ce: 0.0556, aux.acc_seg: 93.8779, loss: 0.1513, grad_norm: 1.4075
2023-02-19 16:20:02,500 - mmseg - INFO - Iter [154550/160000]	lr: 2.044e-06, eta: 0:25:57, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.9338, aux.loss_ce: 0.0559, aux.acc_seg: 93.9925, loss: 0.1468, grad_norm: 1.4987
2023-02-19 16:20:16,163 - mmseg - INFO - Iter [154600/160000]	lr: 2.025e-06, eta: 0:25:43, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0877, decode.acc_seg: 96.0190, aux.loss_ce: 0.0546, aux.acc_seg: 93.9713, loss: 0.1423, grad_norm: 1.0738
2023-02-19 16:20:30,269 - mmseg - INFO - Iter [154650/160000]	lr: 2.007e-06, eta: 0:25:29, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.5718, aux.loss_ce: 0.0582, aux.acc_seg: 93.5770, loss: 0.1555, grad_norm: 1.4269
2023-02-19 16:20:44,362 - mmseg - INFO - Iter [154700/160000]	lr: 1.988e-06, eta: 0:25:14, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0916, decode.acc_seg: 95.9017, aux.loss_ce: 0.0554, aux.acc_seg: 93.9459, loss: 0.1470, grad_norm: 1.0903
2023-02-19 16:20:59,474 - mmseg - INFO - Iter [154750/160000]	lr: 1.969e-06, eta: 0:25:00, time: 0.302, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.8606, aux.loss_ce: 0.0528, aux.acc_seg: 94.1475, loss: 0.1443, grad_norm: 1.1419
2023-02-19 16:21:13,655 - mmseg - INFO - Iter [154800/160000]	lr: 1.950e-06, eta: 0:24:46, time: 0.284, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0952, decode.acc_seg: 95.7314, aux.loss_ce: 0.0564, aux.acc_seg: 93.8333, loss: 0.1516, grad_norm: 1.2948
2023-02-19 16:21:27,544 - mmseg - INFO - Iter [154850/160000]	lr: 1.932e-06, eta: 0:24:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0899, decode.acc_seg: 95.9538, aux.loss_ce: 0.0550, aux.acc_seg: 94.0424, loss: 0.1449, grad_norm: 1.2415
2023-02-19 16:21:41,321 - mmseg - INFO - Iter [154900/160000]	lr: 1.913e-06, eta: 0:24:17, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0903, decode.acc_seg: 95.9689, aux.loss_ce: 0.0548, aux.acc_seg: 94.0671, loss: 0.1452, grad_norm: 1.1328
2023-02-19 16:21:54,921 - mmseg - INFO - Iter [154950/160000]	lr: 1.894e-06, eta: 0:24:03, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0973, decode.acc_seg: 95.5717, aux.loss_ce: 0.0576, aux.acc_seg: 93.6557, loss: 0.1549, grad_norm: 1.3733
2023-02-19 16:22:08,994 - mmseg - INFO - Saving checkpoint at 155000 iterations
2023-02-19 16:22:12,225 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:22:12,225 - mmseg - INFO - Iter [155000/160000]	lr: 1.875e-06, eta: 0:23:49, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8686, aux.loss_ce: 0.0567, aux.acc_seg: 93.8025, loss: 0.1500, grad_norm: 1.8526
2023-02-19 16:22:26,004 - mmseg - INFO - Iter [155050/160000]	lr: 1.857e-06, eta: 0:23:34, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.6374, aux.loss_ce: 0.0550, aux.acc_seg: 93.7586, loss: 0.1481, grad_norm: 1.1546
2023-02-19 16:22:40,048 - mmseg - INFO - Iter [155100/160000]	lr: 1.838e-06, eta: 0:23:20, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0922, decode.acc_seg: 95.7818, aux.loss_ce: 0.0543, aux.acc_seg: 93.9563, loss: 0.1465, grad_norm: 1.3489
2023-02-19 16:22:53,854 - mmseg - INFO - Iter [155150/160000]	lr: 1.819e-06, eta: 0:23:06, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0923, decode.acc_seg: 95.9125, aux.loss_ce: 0.0540, aux.acc_seg: 94.1786, loss: 0.1463, grad_norm: 1.0617
2023-02-19 16:23:07,742 - mmseg - INFO - Iter [155200/160000]	lr: 1.800e-06, eta: 0:22:51, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0875, decode.acc_seg: 96.0259, aux.loss_ce: 0.0533, aux.acc_seg: 94.1601, loss: 0.1408, grad_norm: 0.9894
2023-02-19 16:23:21,741 - mmseg - INFO - Iter [155250/160000]	lr: 1.782e-06, eta: 0:22:37, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0916, decode.acc_seg: 95.8718, aux.loss_ce: 0.0545, aux.acc_seg: 94.0243, loss: 0.1462, grad_norm: 1.1346
2023-02-19 16:23:35,415 - mmseg - INFO - Iter [155300/160000]	lr: 1.763e-06, eta: 0:22:23, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8635, aux.loss_ce: 0.0550, aux.acc_seg: 94.0861, loss: 0.1480, grad_norm: 1.3662
2023-02-19 16:23:51,248 - mmseg - INFO - Iter [155350/160000]	lr: 1.744e-06, eta: 0:22:09, time: 0.317, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.8977, aux.loss_ce: 0.0558, aux.acc_seg: 93.9489, loss: 0.1483, grad_norm: 2.2177
2023-02-19 16:24:04,970 - mmseg - INFO - Iter [155400/160000]	lr: 1.725e-06, eta: 0:21:54, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0924, decode.acc_seg: 95.9577, aux.loss_ce: 0.0558, aux.acc_seg: 94.0550, loss: 0.1481, grad_norm: 1.0981
2023-02-19 16:24:18,588 - mmseg - INFO - Iter [155450/160000]	lr: 1.707e-06, eta: 0:21:40, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0861, decode.acc_seg: 96.0308, aux.loss_ce: 0.0522, aux.acc_seg: 94.1729, loss: 0.1383, grad_norm: 1.0648
2023-02-19 16:24:32,328 - mmseg - INFO - Iter [155500/160000]	lr: 1.688e-06, eta: 0:21:26, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0898, decode.acc_seg: 95.9936, aux.loss_ce: 0.0537, aux.acc_seg: 94.1190, loss: 0.1435, grad_norm: 1.1967
2023-02-19 16:24:46,914 - mmseg - INFO - Iter [155550/160000]	lr: 1.669e-06, eta: 0:21:11, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0950, decode.acc_seg: 95.7881, aux.loss_ce: 0.0559, aux.acc_seg: 93.8692, loss: 0.1509, grad_norm: 1.7507
2023-02-19 16:25:00,602 - mmseg - INFO - Iter [155600/160000]	lr: 1.650e-06, eta: 0:20:57, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0900, decode.acc_seg: 95.9074, aux.loss_ce: 0.0539, aux.acc_seg: 94.0134, loss: 0.1439, grad_norm: 1.0435
2023-02-19 16:25:15,172 - mmseg - INFO - Iter [155650/160000]	lr: 1.632e-06, eta: 0:20:43, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0898, decode.acc_seg: 96.0008, aux.loss_ce: 0.0536, aux.acc_seg: 94.1707, loss: 0.1434, grad_norm: 1.2395
2023-02-19 16:25:28,822 - mmseg - INFO - Iter [155700/160000]	lr: 1.613e-06, eta: 0:20:28, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.9477, aux.loss_ce: 0.0525, aux.acc_seg: 94.2536, loss: 0.1421, grad_norm: 1.0416
2023-02-19 16:25:42,601 - mmseg - INFO - Iter [155750/160000]	lr: 1.594e-06, eta: 0:20:14, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0878, decode.acc_seg: 96.0661, aux.loss_ce: 0.0541, aux.acc_seg: 94.1469, loss: 0.1419, grad_norm: 1.0858
2023-02-19 16:25:56,825 - mmseg - INFO - Iter [155800/160000]	lr: 1.575e-06, eta: 0:20:00, time: 0.284, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0963, decode.acc_seg: 95.8377, aux.loss_ce: 0.0565, aux.acc_seg: 94.0243, loss: 0.1528, grad_norm: 1.4528
2023-02-19 16:26:11,117 - mmseg - INFO - Iter [155850/160000]	lr: 1.557e-06, eta: 0:19:46, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0931, decode.acc_seg: 95.7490, aux.loss_ce: 0.0552, aux.acc_seg: 93.8388, loss: 0.1483, grad_norm: 1.2353
2023-02-19 16:26:25,370 - mmseg - INFO - Iter [155900/160000]	lr: 1.538e-06, eta: 0:19:31, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.8841, aux.loss_ce: 0.0557, aux.acc_seg: 93.9804, loss: 0.1483, grad_norm: 1.0638
2023-02-19 16:26:39,125 - mmseg - INFO - Iter [155950/160000]	lr: 1.519e-06, eta: 0:19:17, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0857, decode.acc_seg: 96.0959, aux.loss_ce: 0.0531, aux.acc_seg: 94.1319, loss: 0.1387, grad_norm: 1.1925
2023-02-19 16:26:52,935 - mmseg - INFO - Saving checkpoint at 156000 iterations
2023-02-19 16:26:56,176 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:26:56,176 - mmseg - INFO - Iter [156000/160000]	lr: 1.500e-06, eta: 0:19:03, time: 0.341, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0921, decode.acc_seg: 95.8113, aux.loss_ce: 0.0564, aux.acc_seg: 93.7572, loss: 0.1485, grad_norm: 1.2251
2023-02-19 16:27:10,108 - mmseg - INFO - Iter [156050/160000]	lr: 1.482e-06, eta: 0:18:48, time: 0.279, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0921, decode.acc_seg: 95.8141, aux.loss_ce: 0.0560, aux.acc_seg: 93.8541, loss: 0.1482, grad_norm: 1.0781
2023-02-19 16:27:23,837 - mmseg - INFO - Iter [156100/160000]	lr: 1.463e-06, eta: 0:18:34, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0917, decode.acc_seg: 95.8283, aux.loss_ce: 0.0565, aux.acc_seg: 93.7543, loss: 0.1482, grad_norm: 1.4298
2023-02-19 16:27:37,908 - mmseg - INFO - Iter [156150/160000]	lr: 1.444e-06, eta: 0:18:20, time: 0.282, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0905, decode.acc_seg: 95.8487, aux.loss_ce: 0.0545, aux.acc_seg: 94.0098, loss: 0.1450, grad_norm: 1.0362
2023-02-19 16:27:51,683 - mmseg - INFO - Iter [156200/160000]	lr: 1.425e-06, eta: 0:18:06, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0920, decode.acc_seg: 95.8502, aux.loss_ce: 0.0556, aux.acc_seg: 93.8932, loss: 0.1476, grad_norm: 1.2917
2023-02-19 16:28:05,344 - mmseg - INFO - Iter [156250/160000]	lr: 1.407e-06, eta: 0:17:51, time: 0.273, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0957, decode.acc_seg: 95.7586, aux.loss_ce: 0.0562, aux.acc_seg: 93.9408, loss: 0.1519, grad_norm: 1.7952
2023-02-19 16:28:19,680 - mmseg - INFO - Iter [156300/160000]	lr: 1.388e-06, eta: 0:17:37, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 95.8050, aux.loss_ce: 0.0549, aux.acc_seg: 93.8103, loss: 0.1450, grad_norm: 1.2193
2023-02-19 16:28:33,535 - mmseg - INFO - Iter [156350/160000]	lr: 1.369e-06, eta: 0:17:23, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0910, decode.acc_seg: 95.9244, aux.loss_ce: 0.0573, aux.acc_seg: 93.9466, loss: 0.1483, grad_norm: 1.9257
2023-02-19 16:28:47,272 - mmseg - INFO - Iter [156400/160000]	lr: 1.350e-06, eta: 0:17:08, time: 0.275, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0902, decode.acc_seg: 95.8489, aux.loss_ce: 0.0536, aux.acc_seg: 93.9503, loss: 0.1439, grad_norm: 1.1879
2023-02-19 16:29:01,112 - mmseg - INFO - Iter [156450/160000]	lr: 1.332e-06, eta: 0:16:54, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0913, decode.acc_seg: 95.9596, aux.loss_ce: 0.0548, aux.acc_seg: 94.1508, loss: 0.1461, grad_norm: 1.1015
2023-02-19 16:29:15,193 - mmseg - INFO - Iter [156500/160000]	lr: 1.313e-06, eta: 0:16:40, time: 0.282, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.9675, aux.loss_ce: 0.0568, aux.acc_seg: 93.9776, loss: 0.1494, grad_norm: 1.3219
2023-02-19 16:29:28,808 - mmseg - INFO - Iter [156550/160000]	lr: 1.294e-06, eta: 0:16:25, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0938, decode.acc_seg: 95.8200, aux.loss_ce: 0.0560, aux.acc_seg: 94.0173, loss: 0.1498, grad_norm: 1.4517
2023-02-19 16:29:42,671 - mmseg - INFO - Iter [156600/160000]	lr: 1.275e-06, eta: 0:16:11, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.9261, aux.loss_ce: 0.0551, aux.acc_seg: 94.0214, loss: 0.1465, grad_norm: 1.0849
2023-02-19 16:29:59,232 - mmseg - INFO - Iter [156650/160000]	lr: 1.257e-06, eta: 0:15:57, time: 0.332, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.8117, aux.loss_ce: 0.0564, aux.acc_seg: 93.8508, loss: 0.1478, grad_norm: 1.3742
2023-02-19 16:30:12,803 - mmseg - INFO - Iter [156700/160000]	lr: 1.238e-06, eta: 0:15:43, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0907, decode.acc_seg: 95.9446, aux.loss_ce: 0.0561, aux.acc_seg: 93.8363, loss: 0.1468, grad_norm: 1.3294
2023-02-19 16:30:26,509 - mmseg - INFO - Iter [156750/160000]	lr: 1.219e-06, eta: 0:15:28, time: 0.274, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0921, decode.acc_seg: 95.8488, aux.loss_ce: 0.0567, aux.acc_seg: 93.8224, loss: 0.1488, grad_norm: 1.2297
2023-02-19 16:30:40,489 - mmseg - INFO - Iter [156800/160000]	lr: 1.200e-06, eta: 0:15:14, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0844, decode.acc_seg: 96.2226, aux.loss_ce: 0.0517, aux.acc_seg: 94.3670, loss: 0.1361, grad_norm: 1.2398
2023-02-19 16:30:54,285 - mmseg - INFO - Iter [156850/160000]	lr: 1.182e-06, eta: 0:15:00, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0910, decode.acc_seg: 95.8829, aux.loss_ce: 0.0560, aux.acc_seg: 93.8011, loss: 0.1470, grad_norm: 1.2304
2023-02-19 16:31:08,085 - mmseg - INFO - Iter [156900/160000]	lr: 1.163e-06, eta: 0:14:45, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0881, decode.acc_seg: 96.0630, aux.loss_ce: 0.0535, aux.acc_seg: 94.1957, loss: 0.1416, grad_norm: 1.5462
2023-02-19 16:31:22,001 - mmseg - INFO - Iter [156950/160000]	lr: 1.144e-06, eta: 0:14:31, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0905, decode.acc_seg: 95.8669, aux.loss_ce: 0.0552, aux.acc_seg: 93.8757, loss: 0.1457, grad_norm: 1.3035
2023-02-19 16:31:36,446 - mmseg - INFO - Saving checkpoint at 157000 iterations
2023-02-19 16:31:39,678 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:31:39,678 - mmseg - INFO - Iter [157000/160000]	lr: 1.125e-06, eta: 0:14:17, time: 0.354, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0879, decode.acc_seg: 96.0121, aux.loss_ce: 0.0526, aux.acc_seg: 94.1691, loss: 0.1405, grad_norm: 1.1992
2023-02-19 16:31:53,322 - mmseg - INFO - Iter [157050/160000]	lr: 1.107e-06, eta: 0:14:03, time: 0.272, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0892, decode.acc_seg: 96.0542, aux.loss_ce: 0.0538, aux.acc_seg: 94.2162, loss: 0.1430, grad_norm: 1.0995
2023-02-19 16:32:07,302 - mmseg - INFO - Iter [157100/160000]	lr: 1.088e-06, eta: 0:13:48, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0955, decode.acc_seg: 95.6420, aux.loss_ce: 0.0583, aux.acc_seg: 93.5242, loss: 0.1538, grad_norm: 1.1698
2023-02-19 16:32:21,863 - mmseg - INFO - Iter [157150/160000]	lr: 1.069e-06, eta: 0:13:34, time: 0.291, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0966, decode.acc_seg: 95.6307, aux.loss_ce: 0.0563, aux.acc_seg: 93.8272, loss: 0.1529, grad_norm: 1.4528
2023-02-19 16:32:35,767 - mmseg - INFO - Iter [157200/160000]	lr: 1.050e-06, eta: 0:13:20, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0915, decode.acc_seg: 95.9003, aux.loss_ce: 0.0563, aux.acc_seg: 93.8862, loss: 0.1478, grad_norm: 1.4862
2023-02-19 16:32:49,530 - mmseg - INFO - Iter [157250/160000]	lr: 1.032e-06, eta: 0:13:05, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0923, decode.acc_seg: 95.8697, aux.loss_ce: 0.0564, aux.acc_seg: 93.8720, loss: 0.1487, grad_norm: 1.7702
2023-02-19 16:33:03,253 - mmseg - INFO - Iter [157300/160000]	lr: 1.013e-06, eta: 0:12:51, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0871, decode.acc_seg: 96.0509, aux.loss_ce: 0.0509, aux.acc_seg: 94.3318, loss: 0.1381, grad_norm: 1.2864
2023-02-19 16:33:17,565 - mmseg - INFO - Iter [157350/160000]	lr: 9.941e-07, eta: 0:12:37, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.9957, aux.loss_ce: 0.0547, aux.acc_seg: 94.2373, loss: 0.1472, grad_norm: 1.2783
2023-02-19 16:33:31,487 - mmseg - INFO - Iter [157400/160000]	lr: 9.754e-07, eta: 0:12:23, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0889, decode.acc_seg: 96.0865, aux.loss_ce: 0.0541, aux.acc_seg: 94.1746, loss: 0.1430, grad_norm: 1.0739
2023-02-19 16:33:45,602 - mmseg - INFO - Iter [157450/160000]	lr: 9.566e-07, eta: 0:12:08, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0881, decode.acc_seg: 96.0268, aux.loss_ce: 0.0538, aux.acc_seg: 94.0762, loss: 0.1419, grad_norm: 1.1089
2023-02-19 16:33:59,306 - mmseg - INFO - Iter [157500/160000]	lr: 9.379e-07, eta: 0:11:54, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0895, decode.acc_seg: 95.8612, aux.loss_ce: 0.0540, aux.acc_seg: 93.9412, loss: 0.1435, grad_norm: 1.0005
2023-02-19 16:34:13,569 - mmseg - INFO - Iter [157550/160000]	lr: 9.191e-07, eta: 0:11:40, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0930, decode.acc_seg: 95.8440, aux.loss_ce: 0.0543, aux.acc_seg: 94.0283, loss: 0.1473, grad_norm: 1.5918
2023-02-19 16:34:27,806 - mmseg - INFO - Iter [157600/160000]	lr: 9.004e-07, eta: 0:11:25, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0901, decode.acc_seg: 95.9095, aux.loss_ce: 0.0543, aux.acc_seg: 94.0546, loss: 0.1444, grad_norm: 1.2358
2023-02-19 16:34:41,707 - mmseg - INFO - Iter [157650/160000]	lr: 8.816e-07, eta: 0:11:11, time: 0.278, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0910, decode.acc_seg: 95.8273, aux.loss_ce: 0.0565, aux.acc_seg: 93.7252, loss: 0.1475, grad_norm: 1.2044
2023-02-19 16:34:55,812 - mmseg - INFO - Iter [157700/160000]	lr: 8.629e-07, eta: 0:10:57, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0890, decode.acc_seg: 96.0016, aux.loss_ce: 0.0535, aux.acc_seg: 94.1762, loss: 0.1425, grad_norm: 1.5459
2023-02-19 16:35:10,745 - mmseg - INFO - Iter [157750/160000]	lr: 8.441e-07, eta: 0:10:42, time: 0.299, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7648, aux.loss_ce: 0.0560, aux.acc_seg: 93.8779, loss: 0.1502, grad_norm: 1.1575
2023-02-19 16:35:24,763 - mmseg - INFO - Iter [157800/160000]	lr: 8.254e-07, eta: 0:10:28, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0963, decode.acc_seg: 95.7622, aux.loss_ce: 0.0563, aux.acc_seg: 93.9609, loss: 0.1526, grad_norm: 1.7994
2023-02-19 16:35:38,325 - mmseg - INFO - Iter [157850/160000]	lr: 8.066e-07, eta: 0:10:14, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0918, decode.acc_seg: 95.7508, aux.loss_ce: 0.0566, aux.acc_seg: 93.7098, loss: 0.1484, grad_norm: 1.2131
2023-02-19 16:35:54,221 - mmseg - INFO - Iter [157900/160000]	lr: 7.879e-07, eta: 0:10:00, time: 0.318, data_time: 0.048, memory: 15214, decode.loss_ce: 0.0948, decode.acc_seg: 95.7549, aux.loss_ce: 0.0553, aux.acc_seg: 93.9273, loss: 0.1501, grad_norm: 1.3544
2023-02-19 16:36:07,782 - mmseg - INFO - Iter [157950/160000]	lr: 7.691e-07, eta: 0:09:45, time: 0.271, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.9091, aux.loss_ce: 0.0557, aux.acc_seg: 93.9021, loss: 0.1469, grad_norm: 1.2453
2023-02-19 16:36:21,310 - mmseg - INFO - Saving checkpoint at 158000 iterations
2023-02-19 16:36:24,553 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:36:24,554 - mmseg - INFO - Iter [158000/160000]	lr: 7.504e-07, eta: 0:09:31, time: 0.336, data_time: 0.005, memory: 15214, decode.loss_ce: 0.1034, decode.acc_seg: 95.6852, aux.loss_ce: 0.0604, aux.acc_seg: 93.6021, loss: 0.1638, grad_norm: 1.7673
2023-02-19 16:36:38,978 - mmseg - INFO - Iter [158050/160000]	lr: 7.316e-07, eta: 0:09:17, time: 0.288, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0858, decode.acc_seg: 96.0209, aux.loss_ce: 0.0511, aux.acc_seg: 94.2570, loss: 0.1370, grad_norm: 1.3509
2023-02-19 16:36:52,898 - mmseg - INFO - Iter [158100/160000]	lr: 7.129e-07, eta: 0:09:02, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0895, decode.acc_seg: 95.9936, aux.loss_ce: 0.0543, aux.acc_seg: 94.1959, loss: 0.1438, grad_norm: 1.2252
2023-02-19 16:37:07,390 - mmseg - INFO - Iter [158150/160000]	lr: 6.941e-07, eta: 0:08:48, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0881, decode.acc_seg: 95.9885, aux.loss_ce: 0.0526, aux.acc_seg: 94.1178, loss: 0.1407, grad_norm: 1.1235
2023-02-19 16:37:21,214 - mmseg - INFO - Iter [158200/160000]	lr: 6.754e-07, eta: 0:08:34, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0880, decode.acc_seg: 96.0185, aux.loss_ce: 0.0525, aux.acc_seg: 94.2504, loss: 0.1405, grad_norm: 1.3493
2023-02-19 16:37:34,984 - mmseg - INFO - Iter [158250/160000]	lr: 6.566e-07, eta: 0:08:20, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.8637, aux.loss_ce: 0.0554, aux.acc_seg: 93.8574, loss: 0.1463, grad_norm: 1.7523
2023-02-19 16:37:49,104 - mmseg - INFO - Iter [158300/160000]	lr: 6.379e-07, eta: 0:08:05, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0896, decode.acc_seg: 95.9249, aux.loss_ce: 0.0548, aux.acc_seg: 93.9820, loss: 0.1444, grad_norm: 1.1274
2023-02-19 16:38:03,403 - mmseg - INFO - Iter [158350/160000]	lr: 6.191e-07, eta: 0:07:51, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0939, decode.acc_seg: 95.7555, aux.loss_ce: 0.0555, aux.acc_seg: 93.9862, loss: 0.1495, grad_norm: 1.2376
2023-02-19 16:38:17,227 - mmseg - INFO - Iter [158400/160000]	lr: 6.004e-07, eta: 0:07:37, time: 0.276, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0909, decode.acc_seg: 95.8738, aux.loss_ce: 0.0546, aux.acc_seg: 94.0385, loss: 0.1456, grad_norm: 1.7447
2023-02-19 16:38:30,959 - mmseg - INFO - Iter [158450/160000]	lr: 5.816e-07, eta: 0:07:22, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0883, decode.acc_seg: 96.1058, aux.loss_ce: 0.0530, aux.acc_seg: 94.2602, loss: 0.1412, grad_norm: 1.1864
2023-02-19 16:38:45,234 - mmseg - INFO - Iter [158500/160000]	lr: 5.629e-07, eta: 0:07:08, time: 0.286, data_time: 0.006, memory: 15214, decode.loss_ce: 0.0986, decode.acc_seg: 95.7107, aux.loss_ce: 0.0566, aux.acc_seg: 93.9274, loss: 0.1552, grad_norm: 1.3564
2023-02-19 16:38:58,967 - mmseg - INFO - Iter [158550/160000]	lr: 5.441e-07, eta: 0:06:54, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0888, decode.acc_seg: 95.9735, aux.loss_ce: 0.0523, aux.acc_seg: 94.1858, loss: 0.1411, grad_norm: 1.2179
2023-02-19 16:39:13,040 - mmseg - INFO - Iter [158600/160000]	lr: 5.254e-07, eta: 0:06:40, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0892, decode.acc_seg: 95.8450, aux.loss_ce: 0.0537, aux.acc_seg: 93.9658, loss: 0.1429, grad_norm: 1.1716
2023-02-19 16:39:28,267 - mmseg - INFO - Iter [158650/160000]	lr: 5.066e-07, eta: 0:06:25, time: 0.304, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0897, decode.acc_seg: 95.9240, aux.loss_ce: 0.0537, aux.acc_seg: 94.1322, loss: 0.1434, grad_norm: 1.0204
2023-02-19 16:39:42,639 - mmseg - INFO - Iter [158700/160000]	lr: 4.879e-07, eta: 0:06:11, time: 0.287, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0907, decode.acc_seg: 95.9962, aux.loss_ce: 0.0535, aux.acc_seg: 94.1361, loss: 0.1441, grad_norm: 1.4776
2023-02-19 16:39:56,504 - mmseg - INFO - Iter [158750/160000]	lr: 4.691e-07, eta: 0:05:57, time: 0.277, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0936, decode.acc_seg: 95.8535, aux.loss_ce: 0.0558, aux.acc_seg: 93.9315, loss: 0.1494, grad_norm: 1.1193
2023-02-19 16:40:10,484 - mmseg - INFO - Iter [158800/160000]	lr: 4.504e-07, eta: 0:05:42, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0868, decode.acc_seg: 96.0728, aux.loss_ce: 0.0519, aux.acc_seg: 94.3551, loss: 0.1387, grad_norm: 0.8986
2023-02-19 16:40:24,438 - mmseg - INFO - Iter [158850/160000]	lr: 4.316e-07, eta: 0:05:28, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0924, decode.acc_seg: 95.8991, aux.loss_ce: 0.0556, aux.acc_seg: 94.0090, loss: 0.1480, grad_norm: 1.1048
2023-02-19 16:40:38,127 - mmseg - INFO - Iter [158900/160000]	lr: 4.129e-07, eta: 0:05:14, time: 0.274, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0976, decode.acc_seg: 95.6917, aux.loss_ce: 0.0590, aux.acc_seg: 93.7439, loss: 0.1565, grad_norm: 1.5442
2023-02-19 16:40:52,163 - mmseg - INFO - Iter [158950/160000]	lr: 3.941e-07, eta: 0:05:00, time: 0.281, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0908, decode.acc_seg: 95.9106, aux.loss_ce: 0.0553, aux.acc_seg: 93.8974, loss: 0.1461, grad_norm: 1.1666
2023-02-19 16:41:06,012 - mmseg - INFO - Saving checkpoint at 159000 iterations
2023-02-19 16:41:09,263 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:41:09,263 - mmseg - INFO - Iter [159000/160000]	lr: 3.754e-07, eta: 0:04:45, time: 0.342, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0864, decode.acc_seg: 95.9418, aux.loss_ce: 0.0525, aux.acc_seg: 94.0299, loss: 0.1389, grad_norm: 1.0310
2023-02-19 16:41:23,013 - mmseg - INFO - Iter [159050/160000]	lr: 3.566e-07, eta: 0:04:31, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0925, decode.acc_seg: 95.8084, aux.loss_ce: 0.0565, aux.acc_seg: 93.8367, loss: 0.1490, grad_norm: 1.1052
2023-02-19 16:41:36,765 - mmseg - INFO - Iter [159100/160000]	lr: 3.379e-07, eta: 0:04:17, time: 0.275, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0932, decode.acc_seg: 95.8627, aux.loss_ce: 0.0561, aux.acc_seg: 93.8983, loss: 0.1494, grad_norm: 1.5214
2023-02-19 16:41:52,691 - mmseg - INFO - Iter [159150/160000]	lr: 3.191e-07, eta: 0:04:02, time: 0.319, data_time: 0.047, memory: 15214, decode.loss_ce: 0.0914, decode.acc_seg: 95.8800, aux.loss_ce: 0.0549, aux.acc_seg: 93.9106, loss: 0.1463, grad_norm: 1.0868
2023-02-19 16:42:06,284 - mmseg - INFO - Iter [159200/160000]	lr: 3.004e-07, eta: 0:03:48, time: 0.272, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0872, decode.acc_seg: 96.0415, aux.loss_ce: 0.0537, aux.acc_seg: 94.0518, loss: 0.1409, grad_norm: 1.0151
2023-02-19 16:42:20,392 - mmseg - INFO - Iter [159250/160000]	lr: 2.816e-07, eta: 0:03:34, time: 0.282, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0926, decode.acc_seg: 95.8333, aux.loss_ce: 0.0556, aux.acc_seg: 93.9144, loss: 0.1483, grad_norm: 1.2531
2023-02-19 16:42:35,237 - mmseg - INFO - Iter [159300/160000]	lr: 2.629e-07, eta: 0:03:20, time: 0.296, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0891, decode.acc_seg: 95.9242, aux.loss_ce: 0.0540, aux.acc_seg: 94.0171, loss: 0.1432, grad_norm: 1.0144
2023-02-19 16:42:49,653 - mmseg - INFO - Iter [159350/160000]	lr: 2.441e-07, eta: 0:03:05, time: 0.289, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0888, decode.acc_seg: 96.0632, aux.loss_ce: 0.0541, aux.acc_seg: 94.1382, loss: 0.1429, grad_norm: 0.9512
2023-02-19 16:43:03,818 - mmseg - INFO - Iter [159400/160000]	lr: 2.254e-07, eta: 0:02:51, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0912, decode.acc_seg: 95.7916, aux.loss_ce: 0.0544, aux.acc_seg: 93.8952, loss: 0.1456, grad_norm: 1.1819
2023-02-19 16:43:17,964 - mmseg - INFO - Iter [159450/160000]	lr: 2.066e-07, eta: 0:02:37, time: 0.283, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0918, decode.acc_seg: 95.8743, aux.loss_ce: 0.0579, aux.acc_seg: 93.7512, loss: 0.1497, grad_norm: 1.4039
2023-02-19 16:43:31,910 - mmseg - INFO - Iter [159500/160000]	lr: 1.879e-07, eta: 0:02:22, time: 0.279, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0963, decode.acc_seg: 95.6928, aux.loss_ce: 0.0576, aux.acc_seg: 93.7111, loss: 0.1539, grad_norm: 1.4340
2023-02-19 16:43:45,703 - mmseg - INFO - Iter [159550/160000]	lr: 1.691e-07, eta: 0:02:08, time: 0.276, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0892, decode.acc_seg: 96.0070, aux.loss_ce: 0.0540, aux.acc_seg: 94.1678, loss: 0.1433, grad_norm: 1.2385
2023-02-19 16:43:59,355 - mmseg - INFO - Iter [159600/160000]	lr: 1.504e-07, eta: 0:01:54, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0941, decode.acc_seg: 95.7676, aux.loss_ce: 0.0546, aux.acc_seg: 94.0515, loss: 0.1487, grad_norm: 1.1132
2023-02-19 16:44:13,269 - mmseg - INFO - Iter [159650/160000]	lr: 1.316e-07, eta: 0:01:40, time: 0.278, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0911, decode.acc_seg: 95.7942, aux.loss_ce: 0.0521, aux.acc_seg: 94.1072, loss: 0.1432, grad_norm: 1.0500
2023-02-19 16:44:26,908 - mmseg - INFO - Iter [159700/160000]	lr: 1.129e-07, eta: 0:01:25, time: 0.273, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0954, decode.acc_seg: 95.6301, aux.loss_ce: 0.0569, aux.acc_seg: 93.7154, loss: 0.1523, grad_norm: 1.2003
2023-02-19 16:44:40,960 - mmseg - INFO - Iter [159750/160000]	lr: 9.413e-08, eta: 0:01:11, time: 0.281, data_time: 0.004, memory: 15214, decode.loss_ce: 0.0888, decode.acc_seg: 96.0215, aux.loss_ce: 0.0535, aux.acc_seg: 94.2141, loss: 0.1423, grad_norm: 1.0357
2023-02-19 16:44:55,210 - mmseg - INFO - Iter [159800/160000]	lr: 7.537e-08, eta: 0:00:57, time: 0.285, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0893, decode.acc_seg: 95.9901, aux.loss_ce: 0.0543, aux.acc_seg: 94.1280, loss: 0.1437, grad_norm: 1.1141
2023-02-19 16:45:09,718 - mmseg - INFO - Iter [159850/160000]	lr: 5.663e-08, eta: 0:00:42, time: 0.290, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0860, decode.acc_seg: 96.0637, aux.loss_ce: 0.0521, aux.acc_seg: 94.1858, loss: 0.1382, grad_norm: 1.1462
2023-02-19 16:45:24,049 - mmseg - INFO - Iter [159900/160000]	lr: 3.787e-08, eta: 0:00:28, time: 0.286, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0881, decode.acc_seg: 96.1012, aux.loss_ce: 0.0541, aux.acc_seg: 94.1242, loss: 0.1422, grad_norm: 1.1231
2023-02-19 16:45:38,042 - mmseg - INFO - Iter [159950/160000]	lr: 1.913e-08, eta: 0:00:14, time: 0.280, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0895, decode.acc_seg: 95.8867, aux.loss_ce: 0.0543, aux.acc_seg: 93.9523, loss: 0.1437, grad_norm: 1.0623
2023-02-19 16:45:52,000 - mmseg - INFO - Saving checkpoint at 160000 iterations
2023-02-19 16:45:55,329 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:45:55,329 - mmseg - INFO - Iter [160000/160000]	lr: 3.750e-10, eta: 0:00:00, time: 0.346, data_time: 0.005, memory: 15214, decode.loss_ce: 0.0918, decode.acc_seg: 95.9277, aux.loss_ce: 0.0541, aux.acc_seg: 94.1219, loss: 0.1459, grad_norm: 1.0862
2023-02-19 16:46:09,630 - mmseg - INFO - per class results:
2023-02-19 16:46:09,635 - mmseg - INFO - 
+---------------------+-------+-------+
|        Class        |  IoU  |  Acc  |
+---------------------+-------+-------+
|         wall        | 79.15 |  88.6 |
|       building      | 82.48 | 91.89 |
|         sky         | 94.42 | 98.02 |
|        floor        | 81.28 | 91.47 |
|         tree        | 75.67 | 88.48 |
|       ceiling       | 85.06 | 94.23 |
|         road        |  83.8 | 89.44 |
|         bed         | 91.28 | 96.35 |
|      windowpane     | 63.91 |  79.3 |
|        grass        | 67.16 | 82.72 |
|       cabinet       | 62.58 | 73.71 |
|       sidewalk      | 69.18 | 86.33 |
|        person       | 82.68 | 93.25 |
|        earth        | 36.83 | 50.54 |
|         door        | 54.96 | 68.77 |
|        table        | 66.03 | 77.73 |
|       mountain      | 59.59 | 72.08 |
|        plant        | 50.89 | 61.04 |
|       curtain       |  76.0 | 86.58 |
|        chair        | 64.91 |  79.7 |
|         car         | 85.92 | 92.67 |
|        water        | 55.61 | 69.69 |
|       painting      | 76.99 | 90.43 |
|         sofa        | 74.29 | 86.66 |
|        shelf        |  46.1 | 63.41 |
|        house        | 37.73 | 49.33 |
|         sea         | 63.43 | 84.48 |
|        mirror       | 72.62 | 81.53 |
|         rug         |  52.8 | 62.49 |
|        field        | 31.32 | 49.08 |
|       armchair      |  50.9 | 67.54 |
|         seat        | 64.15 | 85.37 |
|        fence        | 42.78 | 56.04 |
|         desk        | 54.76 | 73.01 |
|         rock        | 50.66 | 74.47 |
|       wardrobe      | 46.08 | 67.71 |
|         lamp        |  69.2 | 80.29 |
|       bathtub       | 81.96 | 85.21 |
|       railing       | 38.38 | 51.42 |
|       cushion       | 63.01 |  80.0 |
|         base        | 40.93 | 55.51 |
|         box         | 28.54 | 36.17 |
|        column       | 50.29 | 64.77 |
|      signboard      | 40.25 | 56.33 |
|   chest of drawers  | 39.81 | 59.27 |
|       counter       | 29.34 | 39.37 |
|         sand        | 57.34 | 79.67 |
|         sink        | 74.62 | 82.48 |
|      skyscraper     | 54.45 | 70.04 |
|      fireplace      | 76.92 |  92.9 |
|     refrigerator    | 84.03 | 93.92 |
|      grandstand     | 44.36 | 76.48 |
|         path        | 27.47 | 41.04 |
|        stairs       | 30.27 | 35.29 |
|        runway       | 67.65 | 88.98 |
|         case        |  49.6 |  71.4 |
|      pool table     | 93.97 | 96.83 |
|        pillow       | 56.92 | 65.75 |
|     screen door     | 83.87 | 88.45 |
|       stairway      | 31.63 | 41.97 |
|        river        | 10.02 | 22.31 |
|        bridge       | 66.69 | 79.04 |
|       bookcase      | 48.12 |  67.3 |
|        blind        | 46.14 | 51.36 |
|     coffee table    | 63.88 | 80.28 |
|        toilet       | 87.39 |  92.0 |
|        flower       | 44.28 | 61.99 |
|         book        | 43.51 | 61.29 |
|         hill        | 12.11 | 19.92 |
|        bench        |  48.7 | 53.95 |
|      countertop     | 54.37 | 77.57 |
|        stove        | 82.41 |  86.4 |
|         palm        | 56.66 | 80.64 |
|    kitchen island   | 48.84 |  76.7 |
|       computer      | 77.04 | 85.97 |
|     swivel chair    | 42.51 | 59.72 |
|         boat        | 52.03 | 57.34 |
|         bar         | 41.42 | 55.31 |
|    arcade machine   | 45.18 | 47.47 |
|        hovel        | 42.88 |  48.4 |
|         bus         | 91.81 | 96.85 |
|        towel        | 73.87 | 85.19 |
|        light        |  58.8 | 69.18 |
|        truck        | 41.24 |  51.4 |
|        tower        | 36.69 | 49.42 |
|      chandelier     | 73.29 | 87.48 |
|        awning       | 39.02 | 49.65 |
|     streetlight     | 34.53 | 45.12 |
|        booth        | 52.02 | 61.72 |
| television receiver |  73.6 | 81.08 |
|       airplane      |  62.4 | 69.02 |
|      dirt track     | 10.13 | 39.58 |
|       apparel       | 38.15 | 54.63 |
|         pole        | 25.89 | 38.21 |
|         land        |  5.0  |  6.49 |
|      bannister      |  15.6 | 19.78 |
|      escalator      | 47.73 | 65.59 |
|       ottoman       | 48.69 | 68.24 |
|        bottle       | 37.57 | 60.91 |
|        buffet       | 36.95 | 42.54 |
|        poster       | 28.03 | 39.73 |
|        stage        | 16.14 | 21.36 |
|         van         | 39.88 | 55.75 |
|         ship        | 61.72 | 89.63 |
|       fountain      | 24.56 | 24.84 |
|    conveyer belt    | 81.15 | 92.13 |
|        canopy       | 48.01 | 54.95 |
|        washer       | 71.75 | 74.27 |
|      plaything      | 31.83 | 44.28 |
|    swimming pool    |  55.5 | 70.78 |
|        stool        | 45.55 | 59.05 |
|        barrel       | 35.97 |  74.8 |
|        basket       | 42.85 | 60.57 |
|      waterfall      |  50.9 | 58.82 |
|         tent        | 90.35 | 98.18 |
|         bag         | 18.25 | 23.07 |
|       minibike      | 70.52 | 90.24 |
|        cradle       | 83.45 | 92.21 |
|         oven        | 43.47 | 67.08 |
|         ball        | 52.41 | 69.39 |
|         food        | 56.15 | 64.51 |
|         step        |  8.7  | 10.32 |
|         tank        | 61.62 | 64.18 |
|      trade name     | 25.73 | 32.65 |
|      microwave      |  73.5 | 81.14 |
|         pot         | 53.02 | 62.05 |
|        animal       | 59.33 | 63.13 |
|       bicycle       | 59.87 | 83.14 |
|         lake        | 49.96 | 58.93 |
|      dishwasher     | 71.05 | 80.74 |
|        screen       | 53.72 | 69.26 |
|       blanket       | 27.35 |  33.7 |
|      sculpture      | 72.82 | 85.62 |
|         hood        | 72.82 | 76.56 |
|        sconce       | 53.85 | 68.43 |
|         vase        | 43.74 | 63.73 |
|    traffic light    | 36.55 | 54.41 |
|         tray        | 18.31 | 28.47 |
|        ashcan       | 44.62 |  63.5 |
|         fan         | 70.93 | 83.11 |
|         pier        | 42.52 | 72.97 |
|      crt screen     |  1.44 |  4.06 |
|        plate        | 59.85 | 80.67 |
|       monitor       | 20.08 | 23.41 |
|    bulletin board   | 47.03 | 56.09 |
|        shower       | 12.27 |  20.6 |
|       radiator      | 74.84 | 81.87 |
|        glass        | 17.05 | 18.87 |
|        clock        |  45.4 | 52.85 |
|         flag        | 55.12 | 68.03 |
+---------------------+-------+-------+
2023-02-19 16:46:09,636 - mmseg - INFO - Summary:
2023-02-19 16:46:09,636 - mmseg - INFO - 
+-------+-------+-------+
|  aAcc |  mIoU |  mAcc |
+-------+-------+-------+
| 84.16 | 52.98 | 65.42 |
+-------+-------+-------+
2023-02-19 16:46:13,083 - mmseg - INFO - Now best checkpoint is saved as best_mIoU_iter_160000.pth.
2023-02-19 16:46:13,084 - mmseg - INFO - Best mIoU is 0.5298 at 160000 iter.
2023-02-19 16:46:13,084 - mmseg - INFO - Exp name: diffseg_swin_l_2x8_512x512_160k_ade20k_v20.py
2023-02-19 16:46:13,084 - mmseg - INFO - Iter(val) [250]	aAcc: 0.8416, mIoU: 0.5298, mAcc: 0.6542, IoU.wall: 0.7915, IoU.building: 0.8248, IoU.sky: 0.9442, IoU.floor: 0.8128, IoU.tree: 0.7567, IoU.ceiling: 0.8506, IoU.road: 0.8380, IoU.bed : 0.9128, IoU.windowpane: 0.6391, IoU.grass: 0.6716, IoU.cabinet: 0.6258, IoU.sidewalk: 0.6918, IoU.person: 0.8268, IoU.earth: 0.3683, IoU.door: 0.5496, IoU.table: 0.6603, IoU.mountain: 0.5959, IoU.plant: 0.5089, IoU.curtain: 0.7600, IoU.chair: 0.6491, IoU.car: 0.8592, IoU.water: 0.5561, IoU.painting: 0.7699, IoU.sofa: 0.7429, IoU.shelf: 0.4610, IoU.house: 0.3773, IoU.sea: 0.6343, IoU.mirror: 0.7262, IoU.rug: 0.5280, IoU.field: 0.3132, IoU.armchair: 0.5090, IoU.seat: 0.6415, IoU.fence: 0.4278, IoU.desk: 0.5476, IoU.rock: 0.5066, IoU.wardrobe: 0.4608, IoU.lamp: 0.6920, IoU.bathtub: 0.8196, IoU.railing: 0.3838, IoU.cushion: 0.6301, IoU.base: 0.4093, IoU.box: 0.2854, IoU.column: 0.5029, IoU.signboard: 0.4025, IoU.chest of drawers: 0.3981, IoU.counter: 0.2934, IoU.sand: 0.5734, IoU.sink: 0.7462, IoU.skyscraper: 0.5445, IoU.fireplace: 0.7692, IoU.refrigerator: 0.8403, IoU.grandstand: 0.4436, IoU.path: 0.2747, IoU.stairs: 0.3027, IoU.runway: 0.6765, IoU.case: 0.4960, IoU.pool table: 0.9397, IoU.pillow: 0.5692, IoU.screen door: 0.8387, IoU.stairway: 0.3163, IoU.river: 0.1002, IoU.bridge: 0.6669, IoU.bookcase: 0.4812, IoU.blind: 0.4614, IoU.coffee table: 0.6388, IoU.toilet: 0.8739, IoU.flower: 0.4428, IoU.book: 0.4351, IoU.hill: 0.1211, IoU.bench: 0.4870, IoU.countertop: 0.5437, IoU.stove: 0.8241, IoU.palm: 0.5666, IoU.kitchen island: 0.4884, IoU.computer: 0.7704, IoU.swivel chair: 0.4251, IoU.boat: 0.5203, IoU.bar: 0.4142, IoU.arcade machine: 0.4518, IoU.hovel: 0.4288, IoU.bus: 0.9181, IoU.towel: 0.7387, IoU.light: 0.5880, IoU.truck: 0.4124, IoU.tower: 0.3669, IoU.chandelier: 0.7329, IoU.awning: 0.3902, IoU.streetlight: 0.3453, IoU.booth: 0.5202, IoU.television receiver: 0.7360, IoU.airplane: 0.6240, IoU.dirt track: 0.1013, IoU.apparel: 0.3815, IoU.pole: 0.2589, IoU.land: 0.0500, IoU.bannister: 0.1560, IoU.escalator: 0.4773, IoU.ottoman: 0.4869, IoU.bottle: 0.3757, IoU.buffet: 0.3695, IoU.poster: 0.2803, IoU.stage: 0.1614, IoU.van: 0.3988, IoU.ship: 0.6172, IoU.fountain: 0.2456, IoU.conveyer belt: 0.8115, IoU.canopy: 0.4801, IoU.washer: 0.7175, IoU.plaything: 0.3183, IoU.swimming pool: 0.5550, IoU.stool: 0.4555, IoU.barrel: 0.3597, IoU.basket: 0.4285, IoU.waterfall: 0.5090, IoU.tent: 0.9035, IoU.bag: 0.1825, IoU.minibike: 0.7052, IoU.cradle: 0.8345, IoU.oven: 0.4347, IoU.ball: 0.5241, IoU.food: 0.5615, IoU.step: 0.0870, IoU.tank: 0.6162, IoU.trade name: 0.2573, IoU.microwave: 0.7350, IoU.pot: 0.5302, IoU.animal: 0.5933, IoU.bicycle: 0.5987, IoU.lake: 0.4996, IoU.dishwasher: 0.7105, IoU.screen: 0.5372, IoU.blanket: 0.2735, IoU.sculpture: 0.7282, IoU.hood: 0.7282, IoU.sconce: 0.5385, IoU.vase: 0.4374, IoU.traffic light: 0.3655, IoU.tray: 0.1831, IoU.ashcan: 0.4462, IoU.fan: 0.7093, IoU.pier: 0.4252, IoU.crt screen: 0.0144, IoU.plate: 0.5985, IoU.monitor: 0.2008, IoU.bulletin board: 0.4703, IoU.shower: 0.1227, IoU.radiator: 0.7484, IoU.glass: 0.1705, IoU.clock: 0.4540, IoU.flag: 0.5512, Acc.wall: 0.8860, Acc.building: 0.9189, Acc.sky: 0.9802, Acc.floor: 0.9147, Acc.tree: 0.8848, Acc.ceiling: 0.9423, Acc.road: 0.8944, Acc.bed : 0.9635, Acc.windowpane: 0.7930, Acc.grass: 0.8272, Acc.cabinet: 0.7371, Acc.sidewalk: 0.8633, Acc.person: 0.9325, Acc.earth: 0.5054, Acc.door: 0.6877, Acc.table: 0.7773, Acc.mountain: 0.7208, Acc.plant: 0.6104, Acc.curtain: 0.8658, Acc.chair: 0.7970, Acc.car: 0.9267, Acc.water: 0.6969, Acc.painting: 0.9043, Acc.sofa: 0.8666, Acc.shelf: 0.6341, Acc.house: 0.4933, Acc.sea: 0.8448, Acc.mirror: 0.8153, Acc.rug: 0.6249, Acc.field: 0.4908, Acc.armchair: 0.6754, Acc.seat: 0.8537, Acc.fence: 0.5604, Acc.desk: 0.7301, Acc.rock: 0.7447, Acc.wardrobe: 0.6771, Acc.lamp: 0.8029, Acc.bathtub: 0.8521, Acc.railing: 0.5142, Acc.cushion: 0.8000, Acc.base: 0.5551, Acc.box: 0.3617, Acc.column: 0.6477, Acc.signboard: 0.5633, Acc.chest of drawers: 0.5927, Acc.counter: 0.3937, Acc.sand: 0.7967, Acc.sink: 0.8248, Acc.skyscraper: 0.7004, Acc.fireplace: 0.9290, Acc.refrigerator: 0.9392, Acc.grandstand: 0.7648, Acc.path: 0.4104, Acc.stairs: 0.3529, Acc.runway: 0.8898, Acc.case: 0.7140, Acc.pool table: 0.9683, Acc.pillow: 0.6575, Acc.screen door: 0.8845, Acc.stairway: 0.4197, Acc.river: 0.2231, Acc.bridge: 0.7904, Acc.bookcase: 0.6730, Acc.blind: 0.5136, Acc.coffee table: 0.8028, Acc.toilet: 0.9200, Acc.flower: 0.6199, Acc.book: 0.6129, Acc.hill: 0.1992, Acc.bench: 0.5395, Acc.countertop: 0.7757, Acc.stove: 0.8640, Acc.palm: 0.8064, Acc.kitchen island: 0.7670, Acc.computer: 0.8597, Acc.swivel chair: 0.5972, Acc.boat: 0.5734, Acc.bar: 0.5531, Acc.arcade machine: 0.4747, Acc.hovel: 0.4840, Acc.bus: 0.9685, Acc.towel: 0.8519, Acc.light: 0.6918, Acc.truck: 0.5140, Acc.tower: 0.4942, Acc.chandelier: 0.8748, Acc.awning: 0.4965, Acc.streetlight: 0.4512, Acc.booth: 0.6172, Acc.television receiver: 0.8108, Acc.airplane: 0.6902, Acc.dirt track: 0.3958, Acc.apparel: 0.5463, Acc.pole: 0.3821, Acc.land: 0.0649, Acc.bannister: 0.1978, Acc.escalator: 0.6559, Acc.ottoman: 0.6824, Acc.bottle: 0.6091, Acc.buffet: 0.4254, Acc.poster: 0.3973, Acc.stage: 0.2136, Acc.van: 0.5575, Acc.ship: 0.8963, Acc.fountain: 0.2484, Acc.conveyer belt: 0.9213, Acc.canopy: 0.5495, Acc.washer: 0.7427, Acc.plaything: 0.4428, Acc.swimming pool: 0.7078, Acc.stool: 0.5905, Acc.barrel: 0.7480, Acc.basket: 0.6057, Acc.waterfall: 0.5882, Acc.tent: 0.9818, Acc.bag: 0.2307, Acc.minibike: 0.9024, Acc.cradle: 0.9221, Acc.oven: 0.6708, Acc.ball: 0.6939, Acc.food: 0.6451, Acc.step: 0.1032, Acc.tank: 0.6418, Acc.trade name: 0.3265, Acc.microwave: 0.8114, Acc.pot: 0.6205, Acc.animal: 0.6313, Acc.bicycle: 0.8314, Acc.lake: 0.5893, Acc.dishwasher: 0.8074, Acc.screen: 0.6926, Acc.blanket: 0.3370, Acc.sculpture: 0.8562, Acc.hood: 0.7656, Acc.sconce: 0.6843, Acc.vase: 0.6373, Acc.traffic light: 0.5441, Acc.tray: 0.2847, Acc.ashcan: 0.6350, Acc.fan: 0.8311, Acc.pier: 0.7297, Acc.crt screen: 0.0406, Acc.plate: 0.8067, Acc.monitor: 0.2341, Acc.bulletin board: 0.5609, Acc.shower: 0.2060, Acc.radiator: 0.8187, Acc.glass: 0.1887, Acc.clock: 0.5285, Acc.flag: 0.6803