2024-02-22 09:45:43,047 INFO [train.py:734] (3/4) Training started
2024-02-22 09:45:43,047 INFO [train.py:744] (3/4) Device: cuda:3
2024-02-22 09:45:43,048 INFO [train.py:750] (3/4) {'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': -1, 'log_interval': 50, 'valid_interval': 200, 'env_info': {'k2-version': '1.24.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': '44a9d5682af9fd3ef77074777e15278ec6d390eb', 'k2-git-date': 'Wed Sep 27 11:22:55 2023', 'lhotse-version': '0.0.0+unknown.version', 'torch-version': '1.10.0+cu102', 'torch-cuda-available': True, 'torch-cuda-version': '10.2', 'python-version': '3.8', 'icefall-git-branch': 'update-vits-tokenizer', 'icefall-git-sha1': '595d4a3c-clean', 'icefall-git-date': 'Wed Feb 21 17:50:19 2024', 'icefall-path': '/star-zw/workspace/tts/icefall_tts', 'k2-path': '/star-zw/workspace/k2/k2/k2/python/k2/__init__.py', 'lhotse-path': '/star-zw/workspace/lhotse/lhotse_dev/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-7-1218101249-5bcbfb5567-jsftr', 'IP address': '10.177.6.147'}, 'sampling_rate': 22050, 'frame_shift': 256, 'frame_length': 1024, 'feature_dim': 513, 'n_mels': 80, 'lambda_adv': 1.0, 'lambda_mel': 45.0, 'lambda_feat_match': 2.0, 'lambda_dur': 1.0, 'lambda_kl': 1.0, 'world_size': 4, 'master_port': 12354, 'tensorboard': True, 'num_epochs': 1000, 'start_epoch': 1, 'exp_dir': PosixPath('vits/exp-new-tokenizer-add-sos-eos-4-gpu'), 'tokens': 'data/tokens.txt', 'lr': 0.0002, 'seed': 42, 'print_diagnostics': False, 'inf_check': False, 'save_every_n': 20, 'use_fp16': True, 'manifest_dir': PosixPath('data/spectrogram'), 'max_duration': 500, 'bucketing_sampler': True, 'num_buckets': 30, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': False, 'num_workers': 2, 'input_strategy': 'PrecomputedFeatures', 'blank_id': 0, 'vocab_size': 159}
2024-02-22 09:45:43,048 INFO [train.py:752] (3/4) About to create model
2024-02-22 09:45:45,445 INFO [train.py:758] (3/4) Number of parameters in generator: 35637106
2024-02-22 09:45:45,446 INFO [train.py:760] (3/4) Number of parameters in discriminator: 50974956
2024-02-22 09:45:45,446 INFO [train.py:761] (3/4) Total number of parameters: 86612062
2024-02-22 09:45:50,129 INFO [train.py:768] (3/4) Using DDP
2024-02-22 09:45:50,808 INFO [tts_datamodule.py:310] (3/4) About to get train cuts
2024-02-22 09:45:51,150 INFO [tts_datamodule.py:169] (3/4) About to create train dataset
2024-02-22 09:45:51,150 INFO [tts_datamodule.py:193] (3/4) Using DynamicBucketingSampler.
2024-02-22 09:45:52,663 INFO [tts_datamodule.py:210] (3/4) About to create train dataloader
2024-02-22 09:45:52,664 INFO [tts_datamodule.py:317] (3/4) About to get validation cuts
2024-02-22 09:45:52,665 INFO [tts_datamodule.py:233] (3/4) About to create dev dataset
2024-02-22 09:45:52,671 INFO [tts_datamodule.py:260] (3/4) About to create valid dataloader
2024-02-22 09:45:52,672 INFO [train.py:662] (3/4) Sanity check -- see if any of the batches in epoch 1 would cause OOM.
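As a quick consistency check on the startup section above, the generator and discriminator parameter counts reported by the log should sum to the logged total. A minimal sketch, using only the three values printed in the log:

```python
# Parameter counts copied verbatim from the log entries above.
generator_params = 35_637_106
discriminator_params = 50_974_956
reported_total = 86_612_062

# The logged "Total number of parameters" is the sum of the two sub-models.
assert generator_params + discriminator_params == reported_total
print(reported_total)  # → 86612062
```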
2024-02-22 09:46:00,126 INFO [train.py:709] (3/4) Maximum memory allocated so far is 15292MB
2024-02-22 09:46:04,026 INFO [train.py:709] (3/4) Maximum memory allocated so far is 15636MB
2024-02-22 09:46:10,263 INFO [train.py:709] (3/4) Maximum memory allocated so far is 16021MB
2024-02-22 09:46:14,699 INFO [train.py:709] (3/4) Maximum memory allocated so far is 16032MB
2024-02-22 09:46:25,873 INFO [train.py:709] (3/4) Maximum memory allocated so far is 27028MB
2024-02-22 09:46:34,173 INFO [train.py:709] (3/4) Maximum memory allocated so far is 27032MB
2024-02-22 09:46:34,200 INFO [train.py:845] (3/4) Start epoch 1
2024-02-22 09:46:46,808 INFO [train.py:471] (3/4) Epoch 1, batch 0, global_batch_idx: 0, batch size: 50, loss[discriminator_loss=6, discriminator_real_loss=6, discriminator_fake_loss=0.0009675, generator_loss=1061, generator_mel_loss=80.52, generator_kl_loss=973.5, generator_dur_loss=1.859, generator_adv_loss=4.754, generator_feat_match_loss=0.2998, over 50.00 samples.], tot_loss[discriminator_loss=6, discriminator_real_loss=6, discriminator_fake_loss=0.0009675, generator_loss=1061, generator_mel_loss=80.52, generator_kl_loss=973.5, generator_dur_loss=1.859, generator_adv_loss=4.754, generator_feat_match_loss=0.2998, over 50.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 2.0
2024-02-22 09:46:46,808 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 09:47:04,235 INFO [train.py:534] (3/4) Epoch 1, validation: discriminator_loss=4.831, discriminator_real_loss=4.75, discriminator_fake_loss=0.08159, generator_loss=531.1, generator_mel_loss=71.89, generator_kl_loss=452.3, generator_dur_loss=1.85, generator_adv_loss=4.756, generator_feat_match_loss=0.3225, over 100.00 samples.
2024-02-22 09:47:04,236 INFO [train.py:535] (3/4) Maximum memory allocated so far is 27032MB
2024-02-22 09:50:16,934 INFO [train.py:845] (3/4) Start epoch 2
2024-02-22 09:51:45,699 INFO [train.py:471] (3/4) Epoch 2, batch 13, global_batch_idx: 50, batch size: 60, loss[discriminator_loss=3.197, discriminator_real_loss=2.559, discriminator_fake_loss=0.6387, generator_loss=64.46, generator_mel_loss=44.46, generator_kl_loss=13.43, generator_dur_loss=1.597, generator_adv_loss=2.717, generator_feat_match_loss=2.264, over 60.00 samples.], tot_loss[discriminator_loss=2.7, discriminator_real_loss=1.629, discriminator_fake_loss=1.072, generator_loss=66.34, generator_mel_loss=43.77, generator_kl_loss=15.54, generator_dur_loss=1.583, generator_adv_loss=2.579, generator_feat_match_loss=2.865, over 974.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 2.0
2024-02-22 09:53:43,535 INFO [train.py:845] (3/4) Start epoch 3
2024-02-22 09:56:04,456 INFO [train.py:471] (3/4) Epoch 3, batch 26, global_batch_idx: 100, batch size: 52, loss[discriminator_loss=3.174, discriminator_real_loss=2.047, discriminator_fake_loss=1.127, generator_loss=51.59, generator_mel_loss=37.88, generator_kl_loss=6.427, generator_dur_loss=1.652, generator_adv_loss=3.426, generator_feat_match_loss=2.201, over 52.00 samples.], tot_loss[discriminator_loss=2.815, discriminator_real_loss=1.598, discriminator_fake_loss=1.217, generator_loss=54.7, generator_mel_loss=40.06, generator_kl_loss=7.453, generator_dur_loss=1.64, generator_adv_loss=2.595, generator_feat_match_loss=2.948, over 1972.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 4.0
2024-02-22 09:57:09,526 INFO [train.py:845] (3/4) Start epoch 4
2024-02-22 10:00:39,550 INFO [train.py:845] (3/4) Start epoch 5
2024-02-22 10:01:06,427 INFO [train.py:471] (3/4) Epoch 5, batch 2, global_batch_idx: 150, batch size: 61, loss[discriminator_loss=3.242, discriminator_real_loss=1.986, discriminator_fake_loss=1.256, generator_loss=46.24, generator_mel_loss=36.6, generator_kl_loss=4.097, generator_dur_loss=1.68, generator_adv_loss=1.82, generator_feat_match_loss=2.035, over 61.00 samples.], tot_loss[discriminator_loss=3.457, discriminator_real_loss=2.195, discriminator_fake_loss=1.261, generator_loss=47.53, generator_mel_loss=37.85, generator_kl_loss=4.142, generator_dur_loss=1.678, generator_adv_loss=1.825, generator_feat_match_loss=2.04, over 225.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 4.0
2024-02-22 10:04:07,804 INFO [train.py:845] (3/4) Start epoch 6
2024-02-22 10:05:45,751 INFO [train.py:471] (3/4) Epoch 6, batch 15, global_batch_idx: 200, batch size: 49, loss[discriminator_loss=2.736, discriminator_real_loss=1.385, discriminator_fake_loss=1.352, generator_loss=44.06, generator_mel_loss=35.47, generator_kl_loss=3.21, generator_dur_loss=1.683, generator_adv_loss=1.875, generator_feat_match_loss=1.822, over 49.00 samples.], tot_loss[discriminator_loss=2.867, discriminator_real_loss=1.506, discriminator_fake_loss=1.36, generator_loss=45.26, generator_mel_loss=36.82, generator_kl_loss=3.337, generator_dur_loss=1.664, generator_adv_loss=1.794, generator_feat_match_loss=1.641, over 1247.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2024-02-22 10:05:45,753 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 10:05:53,955 INFO [train.py:534] (3/4) Epoch 6, validation: discriminator_loss=2.526, discriminator_real_loss=1.288, discriminator_fake_loss=1.238, generator_loss=46.21, generator_mel_loss=37.16, generator_kl_loss=3.183, generator_dur_loss=1.709, generator_adv_loss=1.894, generator_feat_match_loss=2.263, over 100.00 samples.
2024-02-22 10:05:53,957 INFO [train.py:535] (3/4) Maximum memory allocated so far is 27880MB
2024-02-22 10:07:46,281 INFO [train.py:845] (3/4) Start epoch 7
2024-02-22 10:10:29,716 INFO [train.py:471] (3/4) Epoch 7, batch 28, global_batch_idx: 250, batch size: 90, loss[discriminator_loss=2.848, discriminator_real_loss=1.322, discriminator_fake_loss=1.525, generator_loss=45.03, generator_mel_loss=36.31, generator_kl_loss=2.804, generator_dur_loss=1.674, generator_adv_loss=2.133, generator_feat_match_loss=2.113, over 90.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.409, discriminator_fake_loss=1.233, generator_loss=45.07, generator_mel_loss=36.1, generator_kl_loss=2.814, generator_dur_loss=1.676, generator_adv_loss=2.103, generator_feat_match_loss=2.376, over 2066.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2024-02-22 10:11:12,854 INFO [train.py:845] (3/4) Start epoch 8
2024-02-22 10:14:37,941 INFO [train.py:845] (3/4) Start epoch 9
2024-02-22 10:15:14,759 INFO [train.py:471] (3/4) Epoch 9, batch 4, global_batch_idx: 300, batch size: 69, loss[discriminator_loss=2.512, discriminator_real_loss=1.384, discriminator_fake_loss=1.127, generator_loss=42.88, generator_mel_loss=34.65, generator_kl_loss=2.282, generator_dur_loss=1.668, generator_adv_loss=1.808, generator_feat_match_loss=2.473, over 69.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.325, discriminator_fake_loss=1.308, generator_loss=43.47, generator_mel_loss=35.16, generator_kl_loss=2.427, generator_dur_loss=1.654, generator_adv_loss=2.027, generator_feat_match_loss=2.202, over 490.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2024-02-22 10:18:11,180 INFO [train.py:845] (3/4) Start epoch 10
2024-02-22 10:19:57,821 INFO [train.py:471] (3/4) Epoch 10, batch 17, global_batch_idx: 350, batch size: 126, loss[discriminator_loss=2.742, discriminator_real_loss=1.709, discriminator_fake_loss=1.032, generator_loss=41.67, generator_mel_loss=33.78, generator_kl_loss=2.135, generator_dur_loss=1.659, generator_adv_loss=1.765, generator_feat_match_loss=2.334, over 126.00 samples.], tot_loss[discriminator_loss=2.668, discriminator_real_loss=1.398, discriminator_fake_loss=1.27, generator_loss=42.6, generator_mel_loss=34.6, generator_kl_loss=2.245, generator_dur_loss=1.68, generator_adv_loss=1.969, generator_feat_match_loss=2.107, over 1322.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 8.0
2024-02-22 10:21:40,214 INFO [train.py:845] (3/4) Start epoch 11
2024-02-22 10:24:31,437 INFO [train.py:471] (3/4) Epoch 11, batch 30, global_batch_idx: 400, batch size: 51, loss[discriminator_loss=2.57, discriminator_real_loss=1.277, discriminator_fake_loss=1.294, generator_loss=40.47, generator_mel_loss=32.05, generator_kl_loss=2.105, generator_dur_loss=1.699, generator_adv_loss=2.129, generator_feat_match_loss=2.484, over 51.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.331, discriminator_fake_loss=1.097, generator_loss=43.14, generator_mel_loss=33.98, generator_kl_loss=2.169, generator_dur_loss=1.679, generator_adv_loss=2.262, generator_feat_match_loss=3.052, over 2278.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:24:31,438 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 10:24:39,870 INFO [train.py:534] (3/4) Epoch 11, validation: discriminator_loss=2.268, discriminator_real_loss=1.115, discriminator_fake_loss=1.153, generator_loss=43.03, generator_mel_loss=33.87, generator_kl_loss=2.199, generator_dur_loss=1.704, generator_adv_loss=2.041, generator_feat_match_loss=3.221, over 100.00 samples.
2024-02-22 10:24:39,871 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 10:25:16,404 INFO [train.py:845] (3/4) Start epoch 12
2024-02-22 10:28:45,720 INFO [train.py:845] (3/4) Start epoch 13
2024-02-22 10:29:32,807 INFO [train.py:471] (3/4) Epoch 13, batch 6, global_batch_idx: 450, batch size: 73, loss[discriminator_loss=2.68, discriminator_real_loss=1.414, discriminator_fake_loss=1.265, generator_loss=42.22, generator_mel_loss=34.21, generator_kl_loss=2.052, generator_dur_loss=1.684, generator_adv_loss=2.07, generator_feat_match_loss=2.205, over 73.00 samples.], tot_loss[discriminator_loss=2.632, discriminator_real_loss=1.433, discriminator_fake_loss=1.199, generator_loss=41.19, generator_mel_loss=33.05, generator_kl_loss=2.006, generator_dur_loss=1.695, generator_adv_loss=2.09, generator_feat_match_loss=2.347, over 495.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:32:18,421 INFO [train.py:845] (3/4) Start epoch 14
2024-02-22 10:34:05,479 INFO [train.py:471] (3/4) Epoch 14, batch 19, global_batch_idx: 500, batch size: 54, loss[discriminator_loss=3.02, discriminator_real_loss=1.631, discriminator_fake_loss=1.388, generator_loss=39.23, generator_mel_loss=32.34, generator_kl_loss=2.036, generator_dur_loss=1.731, generator_adv_loss=1.745, generator_feat_match_loss=1.372, over 54.00 samples.], tot_loss[discriminator_loss=2.691, discriminator_real_loss=1.434, discriminator_fake_loss=1.257, generator_loss=41.44, generator_mel_loss=33.18, generator_kl_loss=2.024, generator_dur_loss=1.688, generator_adv_loss=2.145, generator_feat_match_loss=2.408, over 1367.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:35:41,672 INFO [train.py:845] (3/4) Start epoch 15
2024-02-22 10:38:42,733 INFO [train.py:471] (3/4) Epoch 15, batch 32, global_batch_idx: 550, batch size: 69, loss[discriminator_loss=2.689, discriminator_real_loss=1.373, discriminator_fake_loss=1.316, generator_loss=38.76, generator_mel_loss=31.36, generator_kl_loss=2.049, generator_dur_loss=1.68, generator_adv_loss=1.811, generator_feat_match_loss=1.861, over 69.00 samples.], tot_loss[discriminator_loss=2.786, discriminator_real_loss=1.476, discriminator_fake_loss=1.31, generator_loss=40.18, generator_mel_loss=32.56, generator_kl_loss=2.055, generator_dur_loss=1.686, generator_adv_loss=1.917, generator_feat_match_loss=1.962, over 2268.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:39:06,425 INFO [train.py:845] (3/4) Start epoch 16
2024-02-22 10:42:35,435 INFO [train.py:845] (3/4) Start epoch 17
2024-02-22 10:43:38,004 INFO [train.py:471] (3/4) Epoch 17, batch 8, global_batch_idx: 600, batch size: 59, loss[discriminator_loss=2.715, discriminator_real_loss=1.497, discriminator_fake_loss=1.218, generator_loss=40.25, generator_mel_loss=32.87, generator_kl_loss=2.027, generator_dur_loss=1.704, generator_adv_loss=1.784, generator_feat_match_loss=1.869, over 59.00 samples.], tot_loss[discriminator_loss=2.794, discriminator_real_loss=1.442, discriminator_fake_loss=1.352, generator_loss=39.86, generator_mel_loss=32.33, generator_kl_loss=2.092, generator_dur_loss=1.669, generator_adv_loss=1.814, generator_feat_match_loss=1.958, over 735.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:43:38,006 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 10:43:45,983 INFO [train.py:534] (3/4) Epoch 17, validation: discriminator_loss=2.629, discriminator_real_loss=1.268, discriminator_fake_loss=1.361, generator_loss=41.05, generator_mel_loss=33.52, generator_kl_loss=2.181, generator_dur_loss=1.681, generator_adv_loss=1.665, generator_feat_match_loss=2.001, over 100.00 samples.
2024-02-22 10:43:45,984 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 10:46:18,530 INFO [train.py:845] (3/4) Start epoch 18
2024-02-22 10:48:26,433 INFO [train.py:471] (3/4) Epoch 18, batch 21, global_batch_idx: 650, batch size: 85, loss[discriminator_loss=2.641, discriminator_real_loss=1.49, discriminator_fake_loss=1.15, generator_loss=39.33, generator_mel_loss=31.68, generator_kl_loss=1.969, generator_dur_loss=1.655, generator_adv_loss=1.896, generator_feat_match_loss=2.133, over 85.00 samples.], tot_loss[discriminator_loss=2.66, discriminator_real_loss=1.387, discriminator_fake_loss=1.273, generator_loss=39.42, generator_mel_loss=31.45, generator_kl_loss=2.065, generator_dur_loss=1.685, generator_adv_loss=2.019, generator_feat_match_loss=2.199, over 1532.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:49:52,433 INFO [train.py:845] (3/4) Start epoch 19
2024-02-22 10:53:05,153 INFO [train.py:471] (3/4) Epoch 19, batch 34, global_batch_idx: 700, batch size: 126, loss[discriminator_loss=2.506, discriminator_real_loss=1.135, discriminator_fake_loss=1.371, generator_loss=40.26, generator_mel_loss=31.76, generator_kl_loss=2.013, generator_dur_loss=1.644, generator_adv_loss=2.242, generator_feat_match_loss=2.604, over 126.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.367, discriminator_fake_loss=1.249, generator_loss=39.5, generator_mel_loss=31.39, generator_kl_loss=2.044, generator_dur_loss=1.689, generator_adv_loss=2.015, generator_feat_match_loss=2.358, over 2400.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 10:53:20,895 INFO [train.py:845] (3/4) Start epoch 20
2024-02-22 10:56:52,280 INFO [train.py:845] (3/4) Start epoch 21
2024-02-22 10:57:56,163 INFO [train.py:471] (3/4) Epoch 21, batch 10, global_batch_idx: 750, batch size: 63, loss[discriminator_loss=2.574, discriminator_real_loss=1.402, discriminator_fake_loss=1.172, generator_loss=38.89, generator_mel_loss=31.04, generator_kl_loss=1.947, generator_dur_loss=1.706, generator_adv_loss=1.857, generator_feat_match_loss=2.342, over 63.00 samples.], tot_loss[discriminator_loss=2.608, discriminator_real_loss=1.35, discriminator_fake_loss=1.258, generator_loss=38.89, generator_mel_loss=30.8, generator_kl_loss=2.011, generator_dur_loss=1.674, generator_adv_loss=2.015, generator_feat_match_loss=2.387, over 862.00 samples.], cur_lr_g: 2.00e-04, cur_lr_d: 2.00e-04, grad_scale: 16.0
2024-02-22 11:00:25,443 INFO [train.py:845] (3/4) Start epoch 22
2024-02-22 11:02:42,529 INFO [train.py:471] (3/4) Epoch 22, batch 23, global_batch_idx: 800, batch size: 71, loss[discriminator_loss=2.678, discriminator_real_loss=1.138, discriminator_fake_loss=1.54, generator_loss=39.18, generator_mel_loss=31.08, generator_kl_loss=1.966, generator_dur_loss=1.696, generator_adv_loss=2.27, generator_feat_match_loss=2.17, over 71.00 samples.], tot_loss[discriminator_loss=2.643, discriminator_real_loss=1.375, discriminator_fake_loss=1.268, generator_loss=38.87, generator_mel_loss=30.81, generator_kl_loss=1.979, generator_dur_loss=1.682, generator_adv_loss=2.042, generator_feat_match_loss=2.362, over 1758.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:02:42,530 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 11:02:51,249 INFO [train.py:534] (3/4) Epoch 22, validation: discriminator_loss=2.629, discriminator_real_loss=1.587, discriminator_fake_loss=1.042, generator_loss=39.72, generator_mel_loss=31.3, generator_kl_loss=2.193, generator_dur_loss=1.675, generator_adv_loss=2.275, generator_feat_match_loss=2.276, over 100.00 samples.
2024-02-22 11:02:51,250 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 11:04:02,870 INFO [train.py:845] (3/4) Start epoch 23
2024-02-22 11:07:36,609 INFO [train.py:471] (3/4) Epoch 23, batch 36, global_batch_idx: 850, batch size: 65, loss[discriminator_loss=2.568, discriminator_real_loss=1.228, discriminator_fake_loss=1.341, generator_loss=38.39, generator_mel_loss=30.33, generator_kl_loss=2.068, generator_dur_loss=1.713, generator_adv_loss=1.916, generator_feat_match_loss=2.361, over 65.00 samples.], tot_loss[discriminator_loss=2.682, discriminator_real_loss=1.417, discriminator_fake_loss=1.265, generator_loss=38.59, generator_mel_loss=30.72, generator_kl_loss=1.982, generator_dur_loss=1.683, generator_adv_loss=1.983, generator_feat_match_loss=2.22, over 2847.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:07:37,089 INFO [train.py:845] (3/4) Start epoch 24
2024-02-22 11:10:59,691 INFO [train.py:845] (3/4) Start epoch 25
2024-02-22 11:12:17,890 INFO [train.py:471] (3/4) Epoch 25, batch 12, global_batch_idx: 900, batch size: 69, loss[discriminator_loss=2.701, discriminator_real_loss=1.349, discriminator_fake_loss=1.353, generator_loss=38.21, generator_mel_loss=30.2, generator_kl_loss=1.909, generator_dur_loss=1.697, generator_adv_loss=2.174, generator_feat_match_loss=2.238, over 69.00 samples.], tot_loss[discriminator_loss=2.66, discriminator_real_loss=1.421, discriminator_fake_loss=1.239, generator_loss=38.43, generator_mel_loss=30.31, generator_kl_loss=1.933, generator_dur_loss=1.685, generator_adv_loss=2.09, generator_feat_match_loss=2.408, over 899.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:14:24,553 INFO [train.py:845] (3/4) Start epoch 26
2024-02-22 11:16:51,711 INFO [train.py:471] (3/4) Epoch 26, batch 25, global_batch_idx: 950, batch size: 101, loss[discriminator_loss=2.699, discriminator_real_loss=1.496, discriminator_fake_loss=1.204, generator_loss=38.07, generator_mel_loss=30.04, generator_kl_loss=2.058, generator_dur_loss=1.651, generator_adv_loss=2.051, generator_feat_match_loss=2.275, over 101.00 samples.], tot_loss[discriminator_loss=2.646, discriminator_real_loss=1.407, discriminator_fake_loss=1.24, generator_loss=38.09, generator_mel_loss=30.05, generator_kl_loss=1.951, generator_dur_loss=1.692, generator_adv_loss=2.033, generator_feat_match_loss=2.368, over 1714.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:17:55,482 INFO [train.py:845] (3/4) Start epoch 27
2024-02-22 11:21:23,427 INFO [train.py:845] (3/4) Start epoch 28
2024-02-22 11:21:41,621 INFO [train.py:471] (3/4) Epoch 28, batch 1, global_batch_idx: 1000, batch size: 110, loss[discriminator_loss=2.715, discriminator_real_loss=1.38, discriminator_fake_loss=1.334, generator_loss=38.4, generator_mel_loss=30.45, generator_kl_loss=1.946, generator_dur_loss=1.67, generator_adv_loss=2.131, generator_feat_match_loss=2.203, over 110.00 samples.], tot_loss[discriminator_loss=2.712, discriminator_real_loss=1.427, discriminator_fake_loss=1.285, generator_loss=37.96, generator_mel_loss=30.1, generator_kl_loss=1.932, generator_dur_loss=1.678, generator_adv_loss=2.047, generator_feat_match_loss=2.203, over 189.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:21:41,622 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 11:21:50,650 INFO [train.py:534] (3/4) Epoch 28, validation: discriminator_loss=2.567, discriminator_real_loss=1.44, discriminator_fake_loss=1.127, generator_loss=39.1, generator_mel_loss=30.63, generator_kl_loss=2.251, generator_dur_loss=1.683, generator_adv_loss=2.122, generator_feat_match_loss=2.423, over 100.00 samples.
2024-02-22 11:21:50,651 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 11:25:05,319 INFO [train.py:845] (3/4) Start epoch 29
2024-02-22 11:26:33,688 INFO [train.py:471] (3/4) Epoch 29, batch 14, global_batch_idx: 1050, batch size: 101, loss[discriminator_loss=2.668, discriminator_real_loss=1.547, discriminator_fake_loss=1.121, generator_loss=36.84, generator_mel_loss=29.14, generator_kl_loss=1.96, generator_dur_loss=1.659, generator_adv_loss=1.893, generator_feat_match_loss=2.193, over 101.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.378, discriminator_fake_loss=1.263, generator_loss=37.55, generator_mel_loss=29.54, generator_kl_loss=1.928, generator_dur_loss=1.677, generator_adv_loss=2.024, generator_feat_match_loss=2.385, over 1180.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:28:32,623 INFO [train.py:845] (3/4) Start epoch 30
2024-02-22 11:31:07,097 INFO [train.py:471] (3/4) Epoch 30, batch 27, global_batch_idx: 1100, batch size: 101, loss[discriminator_loss=2.654, discriminator_real_loss=1.428, discriminator_fake_loss=1.227, generator_loss=37.51, generator_mel_loss=29.43, generator_kl_loss=1.911, generator_dur_loss=1.644, generator_adv_loss=2.066, generator_feat_match_loss=2.461, over 101.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.395, discriminator_fake_loss=1.249, generator_loss=37.94, generator_mel_loss=29.79, generator_kl_loss=1.953, generator_dur_loss=1.682, generator_adv_loss=2.047, generator_feat_match_loss=2.472, over 2218.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:31:57,535 INFO [train.py:845] (3/4) Start epoch 31
2024-02-22 11:35:23,030 INFO [train.py:845] (3/4) Start epoch 32
2024-02-22 11:35:55,600 INFO [train.py:471] (3/4) Epoch 32, batch 3, global_batch_idx: 1150, batch size: 95, loss[discriminator_loss=2.57, discriminator_real_loss=1.191, discriminator_fake_loss=1.378, generator_loss=37.84, generator_mel_loss=29.76, generator_kl_loss=1.969, generator_dur_loss=1.657, generator_adv_loss=1.973, generator_feat_match_loss=2.482, over 95.00 samples.], tot_loss[discriminator_loss=2.659, discriminator_real_loss=1.429, discriminator_fake_loss=1.23, generator_loss=37.24, generator_mel_loss=29.28, generator_kl_loss=1.961, generator_dur_loss=1.667, generator_adv_loss=1.996, generator_feat_match_loss=2.333, over 355.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:38:46,519 INFO [train.py:845] (3/4) Start epoch 33
2024-02-22 11:40:33,043 INFO [train.py:471] (3/4) Epoch 33, batch 16, global_batch_idx: 1200, batch size: 52, loss[discriminator_loss=2.627, discriminator_real_loss=1.394, discriminator_fake_loss=1.233, generator_loss=37.14, generator_mel_loss=29.22, generator_kl_loss=1.948, generator_dur_loss=1.719, generator_adv_loss=1.841, generator_feat_match_loss=2.41, over 52.00 samples.], tot_loss[discriminator_loss=2.647, discriminator_real_loss=1.416, discriminator_fake_loss=1.232, generator_loss=37.25, generator_mel_loss=29.15, generator_kl_loss=1.947, generator_dur_loss=1.69, generator_adv_loss=2.029, generator_feat_match_loss=2.443, over 1259.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:40:33,044 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 11:40:42,365 INFO [train.py:534] (3/4) Epoch 33, validation: discriminator_loss=2.646, discriminator_real_loss=1.216, discriminator_fake_loss=1.43, generator_loss=38.43, generator_mel_loss=30.22, generator_kl_loss=2.239, generator_dur_loss=1.714, generator_adv_loss=1.686, generator_feat_match_loss=2.58, over 100.00 samples.
2024-02-22 11:40:42,366 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 11:42:22,857 INFO [train.py:845] (3/4) Start epoch 34
2024-02-22 11:45:09,183 INFO [train.py:471] (3/4) Epoch 34, batch 29, global_batch_idx: 1250, batch size: 53, loss[discriminator_loss=2.582, discriminator_real_loss=1.213, discriminator_fake_loss=1.37, generator_loss=36.58, generator_mel_loss=28.46, generator_kl_loss=2.023, generator_dur_loss=1.738, generator_adv_loss=1.994, generator_feat_match_loss=2.361, over 53.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.378, discriminator_fake_loss=1.235, generator_loss=37.27, generator_mel_loss=29.07, generator_kl_loss=2.001, generator_dur_loss=1.7, generator_adv_loss=2.013, generator_feat_match_loss=2.491, over 2103.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:45:47,518 INFO [train.py:845] (3/4) Start epoch 35
2024-02-22 11:49:18,681 INFO [train.py:845] (3/4) Start epoch 36
2024-02-22 11:49:56,855 INFO [train.py:471] (3/4) Epoch 36, batch 5, global_batch_idx: 1300, batch size: 55, loss[discriminator_loss=2.525, discriminator_real_loss=1.501, discriminator_fake_loss=1.024, generator_loss=37.45, generator_mel_loss=29.03, generator_kl_loss=2.028, generator_dur_loss=1.765, generator_adv_loss=1.953, generator_feat_match_loss=2.678, over 55.00 samples.], tot_loss[discriminator_loss=2.623, discriminator_real_loss=1.374, discriminator_fake_loss=1.249, generator_loss=37.41, generator_mel_loss=29.15, generator_kl_loss=2.006, generator_dur_loss=1.711, generator_adv_loss=2.01, generator_feat_match_loss=2.534, over 437.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:52:48,611 INFO [train.py:845] (3/4) Start epoch 37
2024-02-22 11:54:35,963 INFO [train.py:471] (3/4) Epoch 37, batch 18, global_batch_idx: 1350, batch size: 81, loss[discriminator_loss=2.557, discriminator_real_loss=1.043, discriminator_fake_loss=1.514, generator_loss=36.82, generator_mel_loss=28.44, generator_kl_loss=1.878, generator_dur_loss=1.693, generator_adv_loss=2.166, generator_feat_match_loss=2.635, over 81.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.384, discriminator_fake_loss=1.267, generator_loss=36.93, generator_mel_loss=28.79, generator_kl_loss=1.973, generator_dur_loss=1.711, generator_adv_loss=2.033, generator_feat_match_loss=2.424, over 1307.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:56:15,981 INFO [train.py:845] (3/4) Start epoch 38
2024-02-22 11:59:16,986 INFO [train.py:471] (3/4) Epoch 38, batch 31, global_batch_idx: 1400, batch size: 61, loss[discriminator_loss=2.586, discriminator_real_loss=1.034, discriminator_fake_loss=1.551, generator_loss=37.56, generator_mel_loss=28.92, generator_kl_loss=1.956, generator_dur_loss=1.704, generator_adv_loss=2.336, generator_feat_match_loss=2.646, over 61.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.381, discriminator_fake_loss=1.247, generator_loss=36.66, generator_mel_loss=28.53, generator_kl_loss=1.961, generator_dur_loss=1.701, generator_adv_loss=2.006, generator_feat_match_loss=2.463, over 2409.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 11:59:16,987 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 11:59:25,367 INFO [train.py:534] (3/4) Epoch 38, validation: discriminator_loss=2.544, discriminator_real_loss=1.505, discriminator_fake_loss=1.039, generator_loss=37.82, generator_mel_loss=29.34, generator_kl_loss=1.938, generator_dur_loss=1.725, generator_adv_loss=2.223, generator_feat_match_loss=2.589, over 100.00 samples.
2024-02-22 11:59:25,368 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28046MB
2024-02-22 11:59:49,939 INFO [train.py:845] (3/4) Start epoch 39
2024-02-22 12:03:20,055 INFO [train.py:845] (3/4) Start epoch 40
2024-02-22 12:04:18,759 INFO [train.py:471] (3/4) Epoch 40, batch 7, global_batch_idx: 1450, batch size: 53, loss[discriminator_loss=2.691, discriminator_real_loss=1.494, discriminator_fake_loss=1.196, generator_loss=36.01, generator_mel_loss=28.3, generator_kl_loss=1.899, generator_dur_loss=1.752, generator_adv_loss=1.809, generator_feat_match_loss=2.252, over 53.00 samples.], tot_loss[discriminator_loss=2.671, discriminator_real_loss=1.377, discriminator_fake_loss=1.294, generator_loss=35.89, generator_mel_loss=27.99, generator_kl_loss=1.967, generator_dur_loss=1.708, generator_adv_loss=1.993, generator_feat_match_loss=2.232, over 621.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:06:49,810 INFO [train.py:845] (3/4) Start epoch 41
2024-02-22 12:08:55,617 INFO [train.py:471] (3/4) Epoch 41, batch 20, global_batch_idx: 1500, batch size: 65, loss[discriminator_loss=2.645, discriminator_real_loss=1.514, discriminator_fake_loss=1.131, generator_loss=36.67, generator_mel_loss=28.65, generator_kl_loss=2.043, generator_dur_loss=1.714, generator_adv_loss=1.831, generator_feat_match_loss=2.438, over 65.00 samples.], tot_loss[discriminator_loss=2.68, discriminator_real_loss=1.408, discriminator_fake_loss=1.272, generator_loss=35.99, generator_mel_loss=28.04, generator_kl_loss=1.973, generator_dur_loss=1.712, generator_adv_loss=1.979, generator_feat_match_loss=2.278, over 1560.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:10:24,632 INFO [train.py:845] (3/4) Start epoch 42
2024-02-22 12:13:39,206 INFO [train.py:471] (3/4) Epoch 42, batch 33, global_batch_idx: 1550, batch size: 58, loss[discriminator_loss=2.729, discriminator_real_loss=1.338, discriminator_fake_loss=1.391, generator_loss=35.08, generator_mel_loss=27.18, generator_kl_loss=2.007, generator_dur_loss=1.712, generator_adv_loss=1.967, generator_feat_match_loss=2.219, over 58.00 samples.], tot_loss[discriminator_loss=2.695, discriminator_real_loss=1.388, discriminator_fake_loss=1.307, generator_loss=35.8, generator_mel_loss=27.87, generator_kl_loss=1.979, generator_dur_loss=1.71, generator_adv_loss=1.972, generator_feat_match_loss=2.274, over 2492.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:13:56,110 INFO [train.py:845] (3/4) Start epoch 43
2024-02-22 12:17:18,531 INFO [train.py:845] (3/4) Start epoch 44
2024-02-22 12:18:21,655 INFO [train.py:471] (3/4) Epoch 44, batch 9, global_batch_idx: 1600, batch size: 64, loss[discriminator_loss=2.805, discriminator_real_loss=1.568, discriminator_fake_loss=1.236, generator_loss=36.34, generator_mel_loss=28.72, generator_kl_loss=1.938, generator_dur_loss=1.716, generator_adv_loss=1.767, generator_feat_match_loss=2.207, over 64.00 samples.], tot_loss[discriminator_loss=2.745, discriminator_real_loss=1.419, discriminator_fake_loss=1.326, generator_loss=35.43, generator_mel_loss=27.72, generator_kl_loss=1.983, generator_dur_loss=1.715, generator_adv_loss=1.902, generator_feat_match_loss=2.109, over 755.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:18:21,657 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 12:18:30,085 INFO [train.py:534] (3/4) Epoch 44, validation: discriminator_loss=2.681, discriminator_real_loss=1.301, discriminator_fake_loss=1.381, generator_loss=36.55, generator_mel_loss=28.82, generator_kl_loss=2.043, generator_dur_loss=1.718, generator_adv_loss=1.761, generator_feat_match_loss=2.205, over 100.00 samples.
2024-02-22 12:18:30,086 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 12:20:49,812 INFO [train.py:845] (3/4) Start epoch 45
2024-02-22 12:22:59,808 INFO [train.py:471] (3/4) Epoch 45, batch 22, global_batch_idx: 1650, batch size: 79, loss[discriminator_loss=2.734, discriminator_real_loss=1.482, discriminator_fake_loss=1.252, generator_loss=36.34, generator_mel_loss=28.06, generator_kl_loss=2.012, generator_dur_loss=1.716, generator_adv_loss=2.113, generator_feat_match_loss=2.434, over 79.00 samples.], tot_loss[discriminator_loss=2.712, discriminator_real_loss=1.415, discriminator_fake_loss=1.296, generator_loss=35.14, generator_mel_loss=27.28, generator_kl_loss=2.007, generator_dur_loss=1.718, generator_adv_loss=1.946, generator_feat_match_loss=2.182, over 1665.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:24:14,747 INFO [train.py:845] (3/4) Start epoch 46
2024-02-22 12:27:39,531 INFO [train.py:471] (3/4) Epoch 46, batch 35, global_batch_idx: 1700, batch size: 52, loss[discriminator_loss=2.604, discriminator_real_loss=1.303, discriminator_fake_loss=1.301, generator_loss=34.43, generator_mel_loss=26.32, generator_kl_loss=1.991, generator_dur_loss=1.738, generator_adv_loss=2.053, generator_feat_match_loss=2.33, over 52.00 samples.], tot_loss[discriminator_loss=2.735, discriminator_real_loss=1.411, discriminator_fake_loss=1.325, generator_loss=34.52, generator_mel_loss=26.89, generator_kl_loss=1.997, generator_dur_loss=1.709, generator_adv_loss=1.897, generator_feat_match_loss=2.029, over 2702.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:27:44,189 INFO [train.py:845] (3/4) Start epoch 47
2024-02-22 12:31:12,235 INFO [train.py:845] (3/4) Start epoch 48
2024-02-22 12:32:24,203 INFO [train.py:471] (3/4) Epoch 48, batch 11, global_batch_idx: 1750, batch size: 60, loss[discriminator_loss=2.648, discriminator_real_loss=1.334, discriminator_fake_loss=1.314, generator_loss=34.76, generator_mel_loss=27.19, generator_kl_loss=1.942, generator_dur_loss=1.727, generator_adv_loss=1.819, generator_feat_match_loss=2.082, over 60.00 samples.], tot_loss[discriminator_loss=2.748, discriminator_real_loss=1.434, discriminator_fake_loss=1.314, generator_loss=34.61, generator_mel_loss=26.82, generator_kl_loss=2.06, generator_dur_loss=1.725, generator_adv_loss=1.904, generator_feat_match_loss=2.102, over 790.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:34:37,227 INFO [train.py:845] (3/4) Start epoch 49
2024-02-22 12:37:01,997 INFO [train.py:471] (3/4) Epoch 49, batch 24, global_batch_idx: 1800, batch size: 50, loss[discriminator_loss=2.711, discriminator_real_loss=1.386, discriminator_fake_loss=1.324, generator_loss=35.43, generator_mel_loss=27.21, generator_kl_loss=2.131, generator_dur_loss=1.748, generator_adv_loss=2.092, generator_feat_match_loss=2.258, over 50.00 samples.], tot_loss[discriminator_loss=2.733, discriminator_real_loss=1.428, discriminator_fake_loss=1.305, generator_loss=34.44, generator_mel_loss=26.71, generator_kl_loss=2.04, generator_dur_loss=1.715, generator_adv_loss=1.913, generator_feat_match_loss=2.063, over 1807.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:37:01,999 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 12:37:10,142 INFO [train.py:534] (3/4) Epoch 49, validation: discriminator_loss=2.597, discriminator_real_loss=1.504, discriminator_fake_loss=1.093, generator_loss=35.51, generator_mel_loss=27.49, generator_kl_loss=2.174, generator_dur_loss=1.713, generator_adv_loss=2.055, generator_feat_match_loss=2.071, over 100.00 samples.
2024-02-22 12:37:10,143 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 12:38:16,334 INFO [train.py:845] (3/4) Start epoch 50
2024-02-22 12:41:45,071 INFO [train.py:845] (3/4) Start epoch 51
2024-02-22 12:41:58,545 INFO [train.py:471] (3/4) Epoch 51, batch 0, global_batch_idx: 1850, batch size: 71, loss[discriminator_loss=3.002, discriminator_real_loss=1.242, discriminator_fake_loss=1.76, generator_loss=32.88, generator_mel_loss=25.19, generator_kl_loss=2.071, generator_dur_loss=1.707, generator_adv_loss=2.213, generator_feat_match_loss=1.702, over 71.00 samples.], tot_loss[discriminator_loss=3.002, discriminator_real_loss=1.242, discriminator_fake_loss=1.76, generator_loss=32.88, generator_mel_loss=25.19, generator_kl_loss=2.071, generator_dur_loss=1.707, generator_adv_loss=2.213, generator_feat_match_loss=1.702, over 71.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:45:13,342 INFO [train.py:845] (3/4) Start epoch 52
2024-02-22 12:46:33,545 INFO [train.py:471] (3/4) Epoch 52, batch 13, global_batch_idx: 1900, batch size: 50, loss[discriminator_loss=2.68, discriminator_real_loss=1.215, discriminator_fake_loss=1.464, generator_loss=34.83, generator_mel_loss=26.68, generator_kl_loss=2.046, generator_dur_loss=1.767, generator_adv_loss=2.047, generator_feat_match_loss=2.281, over 50.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.377, discriminator_fake_loss=1.326, generator_loss=34.41, generator_mel_loss=26.58, generator_kl_loss=2.066, generator_dur_loss=1.719, generator_adv_loss=1.867, generator_feat_match_loss=2.174, over 983.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:48:38,758 INFO [train.py:845] (3/4) Start epoch 53
2024-02-22 12:51:14,966 INFO [train.py:471] (3/4) Epoch 53, batch 26, global_batch_idx: 1950, batch size: 56, loss[discriminator_loss=2.758, discriminator_real_loss=1.672, discriminator_fake_loss=1.085, generator_loss=34.57, generator_mel_loss=26.67, generator_kl_loss=2.117, generator_dur_loss=1.744, generator_adv_loss=1.842, generator_feat_match_loss=2.197, over 56.00 samples.], tot_loss[discriminator_loss=2.708, discriminator_real_loss=1.397, discriminator_fake_loss=1.311, generator_loss=34.13, generator_mel_loss=26.3, generator_kl_loss=2.048, generator_dur_loss=1.726, generator_adv_loss=1.916, generator_feat_match_loss=2.14, over 1859.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 32.0
2024-02-22 12:52:07,325 INFO [train.py:845] (3/4) Start epoch 54
2024-02-22 12:55:30,274 INFO [train.py:845] (3/4) Start epoch 55
2024-02-22 12:55:52,230 INFO [train.py:471] (3/4) Epoch 55, batch 2, global_batch_idx: 2000, batch size: 73, loss[discriminator_loss=2.762, discriminator_real_loss=1.502, discriminator_fake_loss=1.26, generator_loss=33.36, generator_mel_loss=25.88, generator_kl_loss=1.988, generator_dur_loss=1.721, generator_adv_loss=1.736, generator_feat_match_loss=2.037, over 73.00 samples.], tot_loss[discriminator_loss=2.783, discriminator_real_loss=1.421, discriminator_fake_loss=1.361, generator_loss=33.94, generator_mel_loss=26.22, generator_kl_loss=2.057, generator_dur_loss=1.728, generator_adv_loss=1.847, generator_feat_match_loss=2.093, over 198.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 12:55:52,232 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 12:56:01,106 INFO [train.py:534] (3/4) Epoch 55, validation: discriminator_loss=2.742, discriminator_real_loss=1.243, discriminator_fake_loss=1.499, generator_loss=34.87, generator_mel_loss=27.47, generator_kl_loss=1.89, generator_dur_loss=1.71, generator_adv_loss=1.667, generator_feat_match_loss=2.139, over 100.00 samples.
2024-02-22 12:56:01,107 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 12:59:05,961 INFO [train.py:845] (3/4) Start epoch 56
2024-02-22 13:00:42,750 INFO [train.py:471] (3/4) Epoch 56, batch 15, global_batch_idx: 2050, batch size: 73, loss[discriminator_loss=2.779, discriminator_real_loss=1.325, discriminator_fake_loss=1.454, generator_loss=33.8, generator_mel_loss=26.14, generator_kl_loss=2.054, generator_dur_loss=1.684, generator_adv_loss=1.851, generator_feat_match_loss=2.072, over 73.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.405, discriminator_fake_loss=1.317, generator_loss=33.61, generator_mel_loss=25.84, generator_kl_loss=2.055, generator_dur_loss=1.72, generator_adv_loss=1.87, generator_feat_match_loss=2.129, over 1048.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 13:02:34,955 INFO [train.py:845] (3/4) Start epoch 57
2024-02-22 13:05:29,683 INFO [train.py:471] (3/4) Epoch 57, batch 28, global_batch_idx: 2100, batch size: 63, loss[discriminator_loss=2.688, discriminator_real_loss=1.168, discriminator_fake_loss=1.521, generator_loss=34.64, generator_mel_loss=26.4, generator_kl_loss=2.117, generator_dur_loss=1.719, generator_adv_loss=2.014, generator_feat_match_loss=2.393, over 63.00 samples.], tot_loss[discriminator_loss=2.726, discriminator_real_loss=1.4, discriminator_fake_loss=1.326, generator_loss=33.66, generator_mel_loss=25.79, generator_kl_loss=2.045, generator_dur_loss=1.716, generator_adv_loss=1.93, generator_feat_match_loss=2.178, over 2007.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 13:06:08,352 INFO [train.py:845] (3/4) Start epoch 58
2024-02-22 13:09:41,037 INFO [train.py:845] (3/4) Start epoch 59
2024-02-22 13:10:14,140 INFO [train.py:471] (3/4) Epoch 59, batch 4, global_batch_idx: 2150, batch size: 76, loss[discriminator_loss=2.746, discriminator_real_loss=1.726, discriminator_fake_loss=1.021, generator_loss=32.8, generator_mel_loss=25.23, generator_kl_loss=2.065, generator_dur_loss=1.714, generator_adv_loss=1.548, generator_feat_match_loss=2.242, over 76.00 samples.], tot_loss[discriminator_loss=2.707, discriminator_real_loss=1.503, discriminator_fake_loss=1.204, generator_loss=33.83, generator_mel_loss=25.71, generator_kl_loss=2.087, generator_dur_loss=1.722, generator_adv_loss=2.023, generator_feat_match_loss=2.289, over 356.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 13:13:11,861 INFO [train.py:845] (3/4) Start epoch 60
2024-02-22 13:15:04,493 INFO [train.py:471] (3/4) Epoch 60, batch 17, global_batch_idx: 2200, batch size: 60, loss[discriminator_loss=2.77, discriminator_real_loss=1.405, discriminator_fake_loss=1.365, generator_loss=32.45, generator_mel_loss=25.1, generator_kl_loss=2.012, generator_dur_loss=1.714, generator_adv_loss=1.775, generator_feat_match_loss=1.84, over 60.00 samples.], tot_loss[discriminator_loss=2.776, discriminator_real_loss=1.45, discriminator_fake_loss=1.326, generator_loss=33.43, generator_mel_loss=25.67, generator_kl_loss=2.037, generator_dur_loss=1.71, generator_adv_loss=1.94, generator_feat_match_loss=2.073, over 1203.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 13:15:04,495 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 13:15:13,047 INFO [train.py:534] (3/4) Epoch 60, validation: discriminator_loss=2.683, discriminator_real_loss=1.277, discriminator_fake_loss=1.406, generator_loss=34.43, generator_mel_loss=26.87, generator_kl_loss=2.029, generator_dur_loss=1.702, generator_adv_loss=1.714, generator_feat_match_loss=2.111, over 100.00 samples.
2024-02-22 13:15:13,048 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 13:16:51,423 INFO [train.py:845] (3/4) Start epoch 61
2024-02-22 13:19:57,291 INFO [train.py:471] (3/4) Epoch 61, batch 30, global_batch_idx: 2250, batch size: 52, loss[discriminator_loss=2.758, discriminator_real_loss=1.409, discriminator_fake_loss=1.35, generator_loss=32.26, generator_mel_loss=24.82, generator_kl_loss=2.024, generator_dur_loss=1.712, generator_adv_loss=1.76, generator_feat_match_loss=1.941, over 52.00 samples.], tot_loss[discriminator_loss=2.734, discriminator_real_loss=1.405, discriminator_fake_loss=1.329, generator_loss=33.07, generator_mel_loss=25.38, generator_kl_loss=2.009, generator_dur_loss=1.69, generator_adv_loss=1.903, generator_feat_match_loss=2.084, over 2223.00 samples.], cur_lr_g: 1.99e-04, cur_lr_d: 1.99e-04, grad_scale: 64.0
2024-02-22 13:20:24,022 INFO [train.py:845] (3/4) Start epoch 62
2024-02-22 13:23:58,811 INFO [train.py:845] (3/4) Start epoch 63
2024-02-22 13:24:49,427 INFO [train.py:471] (3/4) Epoch 63, batch 6, global_batch_idx: 2300, batch size: 52, loss[discriminator_loss=2.746, discriminator_real_loss=1.323, discriminator_fake_loss=1.424, generator_loss=32.6, generator_mel_loss=24.83, generator_kl_loss=1.936, generator_dur_loss=1.681, generator_adv_loss=2.143, generator_feat_match_loss=2.008, over 52.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.358, discriminator_fake_loss=1.307, generator_loss=33, generator_mel_loss=25.2, generator_kl_loss=1.971, generator_dur_loss=1.678, generator_adv_loss=1.904, generator_feat_match_loss=2.252, over 541.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:27:31,842 INFO [train.py:845] (3/4) Start epoch 64
2024-02-22 13:29:32,873 INFO [train.py:471] (3/4) Epoch 64, batch 19, global_batch_idx: 2350, batch size: 73, loss[discriminator_loss=2.73, discriminator_real_loss=1.549, discriminator_fake_loss=1.181, generator_loss=32.85, generator_mel_loss=25.08, generator_kl_loss=2.02, generator_dur_loss=1.64, generator_adv_loss=2.1, generator_feat_match_loss=2.01, over 73.00 samples.], tot_loss[discriminator_loss=2.736, discriminator_real_loss=1.409, discriminator_fake_loss=1.327, generator_loss=32.76, generator_mel_loss=25.16, generator_kl_loss=1.957, generator_dur_loss=1.659, generator_adv_loss=1.891, generator_feat_match_loss=2.091, over 1592.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:31:00,564 INFO [train.py:845] (3/4) Start epoch 65
2024-02-22 13:34:12,391 INFO [train.py:471] (3/4) Epoch 65, batch 32, global_batch_idx: 2400, batch size: 54, loss[discriminator_loss=2.666, discriminator_real_loss=1.471, discriminator_fake_loss=1.195, generator_loss=32.07, generator_mel_loss=24.45, generator_kl_loss=2.055, generator_dur_loss=1.685, generator_adv_loss=1.883, generator_feat_match_loss=1.996, over 54.00 samples.], tot_loss[discriminator_loss=2.766, discriminator_real_loss=1.428, discriminator_fake_loss=1.338, generator_loss=32.33, generator_mel_loss=24.83, generator_kl_loss=1.939, generator_dur_loss=1.658, generator_adv_loss=1.889, generator_feat_match_loss=2.017, over 2232.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:34:12,392 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 13:34:21,036 INFO [train.py:534] (3/4) Epoch 65, validation: discriminator_loss=2.655, discriminator_real_loss=1.323, discriminator_fake_loss=1.333, generator_loss=33.61, generator_mel_loss=26.13, generator_kl_loss=1.947, generator_dur_loss=1.639, generator_adv_loss=1.815, generator_feat_match_loss=2.078, over 100.00 samples.
2024-02-22 13:34:21,037 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 13:34:38,693 INFO [train.py:845] (3/4) Start epoch 66
2024-02-22 13:38:08,040 INFO [train.py:845] (3/4) Start epoch 67
2024-02-22 13:39:01,401 INFO [train.py:471] (3/4) Epoch 67, batch 8, global_batch_idx: 2450, batch size: 63, loss[discriminator_loss=2.787, discriminator_real_loss=1.377, discriminator_fake_loss=1.41, generator_loss=33.19, generator_mel_loss=25.6, generator_kl_loss=1.923, generator_dur_loss=1.659, generator_adv_loss=1.876, generator_feat_match_loss=2.133, over 63.00 samples.], tot_loss[discriminator_loss=2.725, discriminator_real_loss=1.374, discriminator_fake_loss=1.351, generator_loss=32.66, generator_mel_loss=24.99, generator_kl_loss=1.947, generator_dur_loss=1.651, generator_adv_loss=1.925, generator_feat_match_loss=2.147, over 540.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:41:41,732 INFO [train.py:845] (3/4) Start epoch 68
2024-02-22 13:43:55,920 INFO [train.py:471] (3/4) Epoch 68, batch 21, global_batch_idx: 2500, batch size: 85, loss[discriminator_loss=2.691, discriminator_real_loss=1.359, discriminator_fake_loss=1.331, generator_loss=32.09, generator_mel_loss=24.53, generator_kl_loss=1.929, generator_dur_loss=1.615, generator_adv_loss=1.867, generator_feat_match_loss=2.146, over 85.00 samples.], tot_loss[discriminator_loss=2.778, discriminator_real_loss=1.42, discriminator_fake_loss=1.359, generator_loss=32.36, generator_mel_loss=24.88, generator_kl_loss=1.928, generator_dur_loss=1.635, generator_adv_loss=1.879, generator_feat_match_loss=2.04, over 1613.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:45:14,360 INFO [train.py:845] (3/4) Start epoch 69
2024-02-22 13:48:33,765 INFO [train.py:471] (3/4) Epoch 69, batch 34, global_batch_idx: 2550, batch size: 50, loss[discriminator_loss=2.777, discriminator_real_loss=1.59, discriminator_fake_loss=1.188, generator_loss=30.97, generator_mel_loss=23.74, generator_kl_loss=1.872, generator_dur_loss=1.632, generator_adv_loss=1.783, generator_feat_match_loss=1.941, over 50.00 samples.], tot_loss[discriminator_loss=2.769, discriminator_real_loss=1.409, discriminator_fake_loss=1.36, generator_loss=32.29, generator_mel_loss=24.75, generator_kl_loss=1.948, generator_dur_loss=1.622, generator_adv_loss=1.89, generator_feat_match_loss=2.085, over 2874.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:48:44,260 INFO [train.py:845] (3/4) Start epoch 70
2024-02-22 13:52:04,588 INFO [train.py:845] (3/4) Start epoch 71
2024-02-22 13:53:11,316 INFO [train.py:471] (3/4) Epoch 71, batch 10, global_batch_idx: 2600, batch size: 53, loss[discriminator_loss=2.676, discriminator_real_loss=1.565, discriminator_fake_loss=1.109, generator_loss=31.98, generator_mel_loss=24.37, generator_kl_loss=1.981, generator_dur_loss=1.63, generator_adv_loss=1.706, generator_feat_match_loss=2.293, over 53.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.401, discriminator_fake_loss=1.323, generator_loss=32.45, generator_mel_loss=24.79, generator_kl_loss=1.937, generator_dur_loss=1.628, generator_adv_loss=1.923, generator_feat_match_loss=2.175, over 837.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:53:11,318 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 13:53:19,568 INFO [train.py:534] (3/4) Epoch 71, validation: discriminator_loss=2.614, discriminator_real_loss=1.109, discriminator_fake_loss=1.505, generator_loss=33.16, generator_mel_loss=25.5, generator_kl_loss=1.929, generator_dur_loss=1.613, generator_adv_loss=1.652, generator_feat_match_loss=2.469, over 100.00 samples.
2024-02-22 13:53:19,569 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 13:55:36,865 INFO [train.py:845] (3/4) Start epoch 72
2024-02-22 13:58:01,615 INFO [train.py:471] (3/4) Epoch 72, batch 23, global_batch_idx: 2650, batch size: 110, loss[discriminator_loss=2.689, discriminator_real_loss=1.326, discriminator_fake_loss=1.363, generator_loss=32.36, generator_mel_loss=24.82, generator_kl_loss=2.034, generator_dur_loss=1.601, generator_adv_loss=1.723, generator_feat_match_loss=2.188, over 110.00 samples.], tot_loss[discriminator_loss=2.75, discriminator_real_loss=1.407, discriminator_fake_loss=1.344, generator_loss=32.08, generator_mel_loss=24.53, generator_kl_loss=1.934, generator_dur_loss=1.613, generator_adv_loss=1.9, generator_feat_match_loss=2.099, over 1762.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 13:59:05,108 INFO [train.py:845] (3/4) Start epoch 73
2024-02-22 14:02:35,894 INFO [train.py:471] (3/4) Epoch 73, batch 36, global_batch_idx: 2700, batch size: 61, loss[discriminator_loss=2.789, discriminator_real_loss=1.226, discriminator_fake_loss=1.564, generator_loss=31.23, generator_mel_loss=23.73, generator_kl_loss=1.916, generator_dur_loss=1.623, generator_adv_loss=1.979, generator_feat_match_loss=1.985, over 61.00 samples.], tot_loss[discriminator_loss=2.716, discriminator_real_loss=1.398, discriminator_fake_loss=1.318, generator_loss=32.04, generator_mel_loss=24.44, generator_kl_loss=1.947, generator_dur_loss=1.613, generator_adv_loss=1.891, generator_feat_match_loss=2.148, over 2686.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:02:36,415 INFO [train.py:845] (3/4) Start epoch 74
2024-02-22 14:06:09,509 INFO [train.py:845] (3/4) Start epoch 75
2024-02-22 14:07:25,717 INFO [train.py:471] (3/4) Epoch 75, batch 12, global_batch_idx: 2750, batch size: 60, loss[discriminator_loss=2.68, discriminator_real_loss=1.255, discriminator_fake_loss=1.424, generator_loss=32.39, generator_mel_loss=24.56, generator_kl_loss=1.919, generator_dur_loss=1.62, generator_adv_loss=1.982, generator_feat_match_loss=2.309, over 60.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.368, discriminator_fake_loss=1.35, generator_loss=32.43, generator_mel_loss=24.67, generator_kl_loss=1.935, generator_dur_loss=1.613, generator_adv_loss=1.963, generator_feat_match_loss=2.255, over 936.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:09:32,530 INFO [train.py:845] (3/4) Start epoch 76
2024-02-22 14:12:01,232 INFO [train.py:471] (3/4) Epoch 76, batch 25, global_batch_idx: 2800, batch size: 63, loss[discriminator_loss=2.756, discriminator_real_loss=1.557, discriminator_fake_loss=1.199, generator_loss=31.51, generator_mel_loss=24.13, generator_kl_loss=1.927, generator_dur_loss=1.628, generator_adv_loss=1.768, generator_feat_match_loss=2.055, over 63.00 samples.], tot_loss[discriminator_loss=2.746, discriminator_real_loss=1.407, discriminator_fake_loss=1.339, generator_loss=32, generator_mel_loss=24.48, generator_kl_loss=1.942, generator_dur_loss=1.607, generator_adv_loss=1.895, generator_feat_match_loss=2.081, over 1754.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:12:01,233 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 14:12:09,536 INFO [train.py:534] (3/4) Epoch 76, validation: discriminator_loss=2.756, discriminator_real_loss=1.335, discriminator_fake_loss=1.421, generator_loss=33.14, generator_mel_loss=25.57, generator_kl_loss=1.976, generator_dur_loss=1.591, generator_adv_loss=1.757, generator_feat_match_loss=2.245, over 100.00 samples.
2024-02-22 14:12:09,536 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 14:13:09,417 INFO [train.py:845] (3/4) Start epoch 77
2024-02-22 14:16:35,670 INFO [train.py:845] (3/4) Start epoch 78
2024-02-22 14:16:54,893 INFO [train.py:471] (3/4) Epoch 78, batch 1, global_batch_idx: 2850, batch size: 101, loss[discriminator_loss=2.787, discriminator_real_loss=1.383, discriminator_fake_loss=1.404, generator_loss=32.12, generator_mel_loss=24.33, generator_kl_loss=1.913, generator_dur_loss=1.574, generator_adv_loss=2.113, generator_feat_match_loss=2.184, over 101.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.414, discriminator_fake_loss=1.34, generator_loss=32.17, generator_mel_loss=24.44, generator_kl_loss=1.93, generator_dur_loss=1.591, generator_adv_loss=1.985, generator_feat_match_loss=2.22, over 186.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:20:02,340 INFO [train.py:845] (3/4) Start epoch 79
2024-02-22 14:21:28,359 INFO [train.py:471] (3/4) Epoch 79, batch 14, global_batch_idx: 2900, batch size: 79, loss[discriminator_loss=2.861, discriminator_real_loss=1.555, discriminator_fake_loss=1.307, generator_loss=31.67, generator_mel_loss=24.24, generator_kl_loss=1.984, generator_dur_loss=1.589, generator_adv_loss=1.702, generator_feat_match_loss=2.156, over 79.00 samples.], tot_loss[discriminator_loss=2.742, discriminator_real_loss=1.406, discriminator_fake_loss=1.336, generator_loss=31.9, generator_mel_loss=24.26, generator_kl_loss=1.939, generator_dur_loss=1.599, generator_adv_loss=1.916, generator_feat_match_loss=2.182, over 1006.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:23:31,855 INFO [train.py:845] (3/4) Start epoch 80
2024-02-22 14:26:12,962 INFO [train.py:471] (3/4) Epoch 80, batch 27, global_batch_idx: 2950, batch size: 52, loss[discriminator_loss=2.684, discriminator_real_loss=1.4, discriminator_fake_loss=1.284, generator_loss=30.47, generator_mel_loss=22.98, generator_kl_loss=2.039, generator_dur_loss=1.602, generator_adv_loss=1.775, generator_feat_match_loss=2.072, over 52.00 samples.], tot_loss[discriminator_loss=2.728, discriminator_real_loss=1.41, discriminator_fake_loss=1.318, generator_loss=32.06, generator_mel_loss=24.37, generator_kl_loss=1.948, generator_dur_loss=1.598, generator_adv_loss=1.909, generator_feat_match_loss=2.239, over 2126.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:26:56,132 INFO [train.py:845] (3/4) Start epoch 81
2024-02-22 14:30:29,739 INFO [train.py:845] (3/4) Start epoch 82
2024-02-22 14:30:58,416 INFO [train.py:471] (3/4) Epoch 82, batch 3, global_batch_idx: 3000, batch size: 52, loss[discriminator_loss=2.793, discriminator_real_loss=1.382, discriminator_fake_loss=1.411, generator_loss=31.06, generator_mel_loss=23.5, generator_kl_loss=1.971, generator_dur_loss=1.605, generator_adv_loss=1.91, generator_feat_match_loss=2.076, over 52.00 samples.], tot_loss[discriminator_loss=2.725, discriminator_real_loss=1.445, discriminator_fake_loss=1.28, generator_loss=31.68, generator_mel_loss=24.01, generator_kl_loss=1.96, generator_dur_loss=1.584, generator_adv_loss=1.891, generator_feat_match_loss=2.234, over 284.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:30:58,418 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 14:31:07,212 INFO [train.py:534] (3/4) Epoch 82, validation: discriminator_loss=2.727, discriminator_real_loss=1.361, discriminator_fake_loss=1.366, generator_loss=32.9, generator_mel_loss=25.12, generator_kl_loss=1.994, generator_dur_loss=1.576, generator_adv_loss=1.835, generator_feat_match_loss=2.378, over 100.00 samples.
2024-02-22 14:31:07,213 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 14:34:06,870 INFO [train.py:845] (3/4) Start epoch 83
2024-02-22 14:35:51,157 INFO [train.py:471] (3/4) Epoch 83, batch 16, global_batch_idx: 3050, batch size: 85, loss[discriminator_loss=2.73, discriminator_real_loss=1.237, discriminator_fake_loss=1.492, generator_loss=31.72, generator_mel_loss=24.02, generator_kl_loss=1.987, generator_dur_loss=1.57, generator_adv_loss=1.973, generator_feat_match_loss=2.168, over 85.00 samples.], tot_loss[discriminator_loss=2.743, discriminator_real_loss=1.41, discriminator_fake_loss=1.333, generator_loss=31.85, generator_mel_loss=24.15, generator_kl_loss=1.957, generator_dur_loss=1.589, generator_adv_loss=1.9, generator_feat_match_loss=2.247, over 1398.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:37:41,560 INFO [train.py:845] (3/4) Start epoch 84
2024-02-22 14:40:26,958 INFO [train.py:471] (3/4) Epoch 84, batch 29, global_batch_idx: 3100, batch size: 63, loss[discriminator_loss=2.65, discriminator_real_loss=1.387, discriminator_fake_loss=1.264, generator_loss=31.65, generator_mel_loss=23.91, generator_kl_loss=1.87, generator_dur_loss=1.586, generator_adv_loss=1.835, generator_feat_match_loss=2.445, over 63.00 samples.], tot_loss[discriminator_loss=2.709, discriminator_real_loss=1.381, discriminator_fake_loss=1.327, generator_loss=31.39, generator_mel_loss=23.75, generator_kl_loss=1.933, generator_dur_loss=1.594, generator_adv_loss=1.898, generator_feat_match_loss=2.22, over 1950.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:41:09,852 INFO [train.py:845] (3/4) Start epoch 85
2024-02-22 14:44:35,831 INFO [train.py:845] (3/4) Start epoch 86
2024-02-22 14:45:12,229 INFO [train.py:471] (3/4) Epoch 86, batch 5, global_batch_idx: 3150, batch size: 81, loss[discriminator_loss=2.689, discriminator_real_loss=1.494, discriminator_fake_loss=1.195, generator_loss=31.41, generator_mel_loss=23.92, generator_kl_loss=2.022, generator_dur_loss=1.605, generator_adv_loss=1.739, generator_feat_match_loss=2.121, over 81.00 samples.], tot_loss[discriminator_loss=2.728, discriminator_real_loss=1.422, discriminator_fake_loss=1.306, generator_loss=31.41, generator_mel_loss=23.99, generator_kl_loss=1.952, generator_dur_loss=1.606, generator_adv_loss=1.828, generator_feat_match_loss=2.034, over 392.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:48:04,876 INFO [train.py:845] (3/4) Start epoch 87
2024-02-22 14:49:58,502 INFO [train.py:471] (3/4) Epoch 87, batch 18, global_batch_idx: 3200, batch size: 65, loss[discriminator_loss=2.742, discriminator_real_loss=1.569, discriminator_fake_loss=1.172, generator_loss=30.75, generator_mel_loss=23.21, generator_kl_loss=1.975, generator_dur_loss=1.591, generator_adv_loss=1.86, generator_feat_match_loss=2.115, over 65.00 samples.], tot_loss[discriminator_loss=2.706, discriminator_real_loss=1.378, discriminator_fake_loss=1.328, generator_loss=31.41, generator_mel_loss=23.76, generator_kl_loss=1.943, generator_dur_loss=1.584, generator_adv_loss=1.884, generator_feat_match_loss=2.232, over 1450.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:49:58,503 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 14:50:06,505 INFO [train.py:534] (3/4) Epoch 87, validation: discriminator_loss=2.679, discriminator_real_loss=1.362, discriminator_fake_loss=1.317, generator_loss=32.39, generator_mel_loss=24.82, generator_kl_loss=2.001, generator_dur_loss=1.584, generator_adv_loss=1.783, generator_feat_match_loss=2.204, over 100.00 samples.
2024-02-22 14:50:06,505 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB
2024-02-22 14:51:44,570 INFO [train.py:845] (3/4) Start epoch 88
2024-02-22 14:54:41,936 INFO [train.py:471] (3/4) Epoch 88, batch 31, global_batch_idx: 3250, batch size: 52, loss[discriminator_loss=2.754, discriminator_real_loss=1.403, discriminator_fake_loss=1.352, generator_loss=31.87, generator_mel_loss=24.04, generator_kl_loss=1.749, generator_dur_loss=1.602, generator_adv_loss=2.316, generator_feat_match_loss=2.166, over 52.00 samples.], tot_loss[discriminator_loss=2.729, discriminator_real_loss=1.409, discriminator_fake_loss=1.32, generator_loss=31.45, generator_mel_loss=23.77, generator_kl_loss=1.941, generator_dur_loss=1.592, generator_adv_loss=1.922, generator_feat_match_loss=2.22, over 2147.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 14:55:12,468 INFO [train.py:845] (3/4) Start epoch 89
2024-02-22 14:58:44,809 INFO [train.py:845] (3/4) Start epoch 90
2024-02-22 14:59:38,108 INFO [train.py:471] (3/4) Epoch 90, batch 7, global_batch_idx: 3300, batch size: 52, loss[discriminator_loss=2.797, discriminator_real_loss=1.477, discriminator_fake_loss=1.32, generator_loss=31.44, generator_mel_loss=23.97, generator_kl_loss=1.964, generator_dur_loss=1.605, generator_adv_loss=1.829, generator_feat_match_loss=2.076, over 52.00 samples.], tot_loss[discriminator_loss=2.708, discriminator_real_loss=1.382, discriminator_fake_loss=1.326, generator_loss=31.88, generator_mel_loss=24.11, generator_kl_loss=1.932, generator_dur_loss=1.591, generator_adv_loss=1.938, generator_feat_match_loss=2.301, over 571.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 15:02:19,752 INFO [train.py:845] (3/4) Start epoch 91
2024-02-22 15:04:21,622 INFO [train.py:471] (3/4) Epoch 91, batch 20, global_batch_idx: 3350, batch size: 59, loss[discriminator_loss=2.717, discriminator_real_loss=1.373, discriminator_fake_loss=1.344, generator_loss=31.18, generator_mel_loss=23.83, generator_kl_loss=1.88, generator_dur_loss=1.601, generator_adv_loss=1.826, generator_feat_match_loss=2.045, over 59.00 samples.], tot_loss[discriminator_loss=2.754, discriminator_real_loss=1.404, discriminator_fake_loss=1.349, generator_loss=31.37, generator_mel_loss=23.76, generator_kl_loss=1.927, generator_dur_loss=1.585, generator_adv_loss=1.898, generator_feat_match_loss=2.195, over 1519.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 15:05:44,018 INFO [train.py:845] (3/4) Start epoch 92
2024-02-22 15:08:48,644 INFO [train.py:471] (3/4) Epoch 92, batch 33, global_batch_idx: 3400, batch size: 51, loss[discriminator_loss=2.732, discriminator_real_loss=1.455, discriminator_fake_loss=1.277, generator_loss=30.19, generator_mel_loss=22.66, generator_kl_loss=1.911, generator_dur_loss=1.589, generator_adv_loss=1.889, generator_feat_match_loss=2.143, over 51.00 samples.], tot_loss[discriminator_loss=2.721, discriminator_real_loss=1.398, discriminator_fake_loss=1.323, generator_loss=31.31, generator_mel_loss=23.67, generator_kl_loss=1.944, generator_dur_loss=1.584, generator_adv_loss=1.883, generator_feat_match_loss=2.231, over 2384.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0
2024-02-22 15:08:48,646 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 15:08:57,065 INFO [train.py:534] (3/4) Epoch 92, validation: discriminator_loss=2.641, discriminator_real_loss=1.283, discriminator_fake_loss=1.359, generator_loss=32.14, generator_mel_loss=24.39, generator_kl_loss=1.946, generator_dur_loss=1.579, generator_adv_loss=1.815, generator_feat_match_loss=2.407, over 100.00 samples.
2024-02-22 15:08:57,065 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28047MB 2024-02-22 15:09:14,999 INFO [train.py:845] (3/4) Start epoch 93 2024-02-22 15:12:46,210 INFO [train.py:845] (3/4) Start epoch 94 2024-02-22 15:13:46,466 INFO [train.py:471] (3/4) Epoch 94, batch 9, global_batch_idx: 3450, batch size: 64, loss[discriminator_loss=2.713, discriminator_real_loss=1.332, discriminator_fake_loss=1.381, generator_loss=32.43, generator_mel_loss=24.68, generator_kl_loss=1.885, generator_dur_loss=1.581, generator_adv_loss=1.9, generator_feat_match_loss=2.389, over 64.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.348, discriminator_fake_loss=1.354, generator_loss=31.45, generator_mel_loss=23.82, generator_kl_loss=1.915, generator_dur_loss=1.585, generator_adv_loss=1.864, generator_feat_match_loss=2.264, over 689.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:16:14,769 INFO [train.py:845] (3/4) Start epoch 95 2024-02-22 15:18:36,053 INFO [train.py:471] (3/4) Epoch 95, batch 22, global_batch_idx: 3500, batch size: 90, loss[discriminator_loss=2.801, discriminator_real_loss=1.26, discriminator_fake_loss=1.541, generator_loss=31.62, generator_mel_loss=24.12, generator_kl_loss=1.995, generator_dur_loss=1.594, generator_adv_loss=1.875, generator_feat_match_loss=2.039, over 90.00 samples.], tot_loss[discriminator_loss=2.748, discriminator_real_loss=1.39, discriminator_fake_loss=1.358, generator_loss=31.14, generator_mel_loss=23.58, generator_kl_loss=1.925, generator_dur_loss=1.585, generator_adv_loss=1.883, generator_feat_match_loss=2.163, over 1591.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:19:47,984 INFO [train.py:845] (3/4) Start epoch 96 2024-02-22 15:23:02,028 INFO [train.py:471] (3/4) Epoch 96, batch 35, global_batch_idx: 3550, batch size: 52, loss[discriminator_loss=2.801, discriminator_real_loss=1.619, discriminator_fake_loss=1.182, 
generator_loss=30.83, generator_mel_loss=23.48, generator_kl_loss=2.011, generator_dur_loss=1.584, generator_adv_loss=1.593, generator_feat_match_loss=2.162, over 52.00 samples.], tot_loss[discriminator_loss=2.756, discriminator_real_loss=1.425, discriminator_fake_loss=1.331, generator_loss=31.22, generator_mel_loss=23.6, generator_kl_loss=1.925, generator_dur_loss=1.58, generator_adv_loss=1.909, generator_feat_match_loss=2.205, over 2607.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:23:08,360 INFO [train.py:845] (3/4) Start epoch 97 2024-02-22 15:26:36,301 INFO [train.py:845] (3/4) Start epoch 98 2024-02-22 15:27:52,678 INFO [train.py:471] (3/4) Epoch 98, batch 11, global_batch_idx: 3600, batch size: 54, loss[discriminator_loss=2.719, discriminator_real_loss=1.237, discriminator_fake_loss=1.482, generator_loss=31.86, generator_mel_loss=24.08, generator_kl_loss=1.946, generator_dur_loss=1.593, generator_adv_loss=2.031, generator_feat_match_loss=2.207, over 54.00 samples.], tot_loss[discriminator_loss=2.782, discriminator_real_loss=1.405, discriminator_fake_loss=1.376, generator_loss=31.23, generator_mel_loss=23.59, generator_kl_loss=1.938, generator_dur_loss=1.588, generator_adv_loss=1.935, generator_feat_match_loss=2.174, over 731.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:27:52,680 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 15:28:01,168 INFO [train.py:534] (3/4) Epoch 98, validation: discriminator_loss=2.697, discriminator_real_loss=1.464, discriminator_fake_loss=1.232, generator_loss=33.15, generator_mel_loss=25.32, generator_kl_loss=1.94, generator_dur_loss=1.572, generator_adv_loss=2.002, generator_feat_match_loss=2.312, over 100.00 samples. 
2024-02-22 15:28:01,169 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 15:30:10,932 INFO [train.py:845] (3/4) Start epoch 99 2024-02-22 15:32:36,400 INFO [train.py:471] (3/4) Epoch 99, batch 24, global_batch_idx: 3650, batch size: 69, loss[discriminator_loss=2.729, discriminator_real_loss=1.456, discriminator_fake_loss=1.272, generator_loss=31.2, generator_mel_loss=23.34, generator_kl_loss=1.979, generator_dur_loss=1.588, generator_adv_loss=2.049, generator_feat_match_loss=2.242, over 69.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.398, discriminator_fake_loss=1.341, generator_loss=30.99, generator_mel_loss=23.5, generator_kl_loss=1.925, generator_dur_loss=1.582, generator_adv_loss=1.869, generator_feat_match_loss=2.113, over 1636.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:33:41,986 INFO [train.py:845] (3/4) Start epoch 100 2024-02-22 15:37:16,288 INFO [train.py:845] (3/4) Start epoch 101 2024-02-22 15:37:29,598 INFO [train.py:471] (3/4) Epoch 101, batch 0, global_batch_idx: 3700, batch size: 53, loss[discriminator_loss=2.748, discriminator_real_loss=1.416, discriminator_fake_loss=1.332, generator_loss=30.37, generator_mel_loss=22.85, generator_kl_loss=1.831, generator_dur_loss=1.601, generator_adv_loss=1.896, generator_feat_match_loss=2.188, over 53.00 samples.], tot_loss[discriminator_loss=2.748, discriminator_real_loss=1.416, discriminator_fake_loss=1.332, generator_loss=30.37, generator_mel_loss=22.85, generator_kl_loss=1.831, generator_dur_loss=1.601, generator_adv_loss=1.896, generator_feat_match_loss=2.188, over 53.00 samples.], cur_lr_g: 1.98e-04, cur_lr_d: 1.98e-04, grad_scale: 64.0 2024-02-22 15:40:44,876 INFO [train.py:845] (3/4) Start epoch 102 2024-02-22 15:42:01,469 INFO [train.py:471] (3/4) Epoch 102, batch 13, global_batch_idx: 3750, batch size: 67, loss[discriminator_loss=2.793, discriminator_real_loss=1.514, discriminator_fake_loss=1.279, 
generator_loss=30.4, generator_mel_loss=22.87, generator_kl_loss=1.954, generator_dur_loss=1.585, generator_adv_loss=1.835, generator_feat_match_loss=2.152, over 67.00 samples.], tot_loss[discriminator_loss=2.747, discriminator_real_loss=1.402, discriminator_fake_loss=1.345, generator_loss=30.92, generator_mel_loss=23.22, generator_kl_loss=1.934, generator_dur_loss=1.578, generator_adv_loss=1.919, generator_feat_match_loss=2.269, over 989.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2024-02-22 15:44:17,611 INFO [train.py:845] (3/4) Start epoch 103 2024-02-22 15:46:38,226 INFO [train.py:471] (3/4) Epoch 103, batch 26, global_batch_idx: 3800, batch size: 55, loss[discriminator_loss=2.744, discriminator_real_loss=1.353, discriminator_fake_loss=1.392, generator_loss=30.05, generator_mel_loss=22.9, generator_kl_loss=1.825, generator_dur_loss=1.6, generator_adv_loss=1.731, generator_feat_match_loss=1.99, over 55.00 samples.], tot_loss[discriminator_loss=2.737, discriminator_real_loss=1.383, discriminator_fake_loss=1.354, generator_loss=31.13, generator_mel_loss=23.53, generator_kl_loss=1.939, generator_dur_loss=1.579, generator_adv_loss=1.872, generator_feat_match_loss=2.216, over 1791.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2024-02-22 15:46:38,228 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 15:46:46,723 INFO [train.py:534] (3/4) Epoch 103, validation: discriminator_loss=2.676, discriminator_real_loss=1.214, discriminator_fake_loss=1.462, generator_loss=31.6, generator_mel_loss=24.15, generator_kl_loss=1.974, generator_dur_loss=1.566, generator_adv_loss=1.688, generator_feat_match_loss=2.218, over 100.00 samples. 
2024-02-22 15:46:46,724 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 15:47:44,844 INFO [train.py:845] (3/4) Start epoch 104 2024-02-22 15:51:13,832 INFO [train.py:845] (3/4) Start epoch 105 2024-02-22 15:51:42,449 INFO [train.py:471] (3/4) Epoch 105, batch 2, global_batch_idx: 3850, batch size: 52, loss[discriminator_loss=2.758, discriminator_real_loss=1.302, discriminator_fake_loss=1.455, generator_loss=30.81, generator_mel_loss=23.01, generator_kl_loss=2.017, generator_dur_loss=1.589, generator_adv_loss=2.08, generator_feat_match_loss=2.111, over 52.00 samples.], tot_loss[discriminator_loss=2.709, discriminator_real_loss=1.35, discriminator_fake_loss=1.359, generator_loss=31.11, generator_mel_loss=23.33, generator_kl_loss=1.976, generator_dur_loss=1.589, generator_adv_loss=1.912, generator_feat_match_loss=2.302, over 179.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2024-02-22 15:54:39,723 INFO [train.py:845] (3/4) Start epoch 106 2024-02-22 15:56:07,187 INFO [train.py:471] (3/4) Epoch 106, batch 15, global_batch_idx: 3900, batch size: 67, loss[discriminator_loss=2.803, discriminator_real_loss=1.263, discriminator_fake_loss=1.54, generator_loss=30.94, generator_mel_loss=23.21, generator_kl_loss=1.95, generator_dur_loss=1.575, generator_adv_loss=2.055, generator_feat_match_loss=2.152, over 67.00 samples.], tot_loss[discriminator_loss=2.742, discriminator_real_loss=1.397, discriminator_fake_loss=1.345, generator_loss=31.18, generator_mel_loss=23.54, generator_kl_loss=1.95, generator_dur_loss=1.576, generator_adv_loss=1.878, generator_feat_match_loss=2.228, over 1165.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2024-02-22 15:58:03,530 INFO [train.py:845] (3/4) Start epoch 107 2024-02-22 16:00:40,291 INFO [train.py:471] (3/4) Epoch 107, batch 28, global_batch_idx: 3950, batch size: 153, loss[discriminator_loss=2.771, discriminator_real_loss=1.449, discriminator_fake_loss=1.322, 
generator_loss=30.86, generator_mel_loss=23.19, generator_kl_loss=1.885, generator_dur_loss=1.549, generator_adv_loss=1.962, generator_feat_match_loss=2.273, over 153.00 samples.], tot_loss[discriminator_loss=2.735, discriminator_real_loss=1.408, discriminator_fake_loss=1.327, generator_loss=30.98, generator_mel_loss=23.29, generator_kl_loss=1.926, generator_dur_loss=1.575, generator_adv_loss=1.898, generator_feat_match_loss=2.291, over 2116.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 64.0 2024-02-22 16:01:33,746 INFO [train.py:845] (3/4) Start epoch 108 2024-02-22 16:04:59,687 INFO [train.py:845] (3/4) Start epoch 109 2024-02-22 16:05:33,239 INFO [train.py:471] (3/4) Epoch 109, batch 4, global_batch_idx: 4000, batch size: 85, loss[discriminator_loss=2.762, discriminator_real_loss=1.362, discriminator_fake_loss=1.4, generator_loss=30.75, generator_mel_loss=23.19, generator_kl_loss=1.969, generator_dur_loss=1.562, generator_adv_loss=1.864, generator_feat_match_loss=2.164, over 85.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.397, discriminator_fake_loss=1.306, generator_loss=30.88, generator_mel_loss=23.24, generator_kl_loss=1.944, generator_dur_loss=1.564, generator_adv_loss=1.884, generator_feat_match_loss=2.247, over 444.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:05:33,240 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 16:05:41,879 INFO [train.py:534] (3/4) Epoch 109, validation: discriminator_loss=2.664, discriminator_real_loss=1.247, discriminator_fake_loss=1.417, generator_loss=31.61, generator_mel_loss=24.09, generator_kl_loss=2.004, generator_dur_loss=1.566, generator_adv_loss=1.779, generator_feat_match_loss=2.168, over 100.00 samples. 
2024-02-22 16:05:41,881 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 16:08:40,748 INFO [train.py:845] (3/4) Start epoch 110 2024-02-22 16:10:24,762 INFO [train.py:471] (3/4) Epoch 110, batch 17, global_batch_idx: 4050, batch size: 65, loss[discriminator_loss=2.68, discriminator_real_loss=1.156, discriminator_fake_loss=1.522, generator_loss=30.62, generator_mel_loss=22.73, generator_kl_loss=1.924, generator_dur_loss=1.557, generator_adv_loss=2.184, generator_feat_match_loss=2.221, over 65.00 samples.], tot_loss[discriminator_loss=2.715, discriminator_real_loss=1.357, discriminator_fake_loss=1.358, generator_loss=30.63, generator_mel_loss=23.01, generator_kl_loss=1.904, generator_dur_loss=1.571, generator_adv_loss=1.891, generator_feat_match_loss=2.257, over 1223.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:12:10,290 INFO [train.py:845] (3/4) Start epoch 111 2024-02-22 16:15:10,974 INFO [train.py:471] (3/4) Epoch 111, batch 30, global_batch_idx: 4100, batch size: 56, loss[discriminator_loss=2.689, discriminator_real_loss=1.289, discriminator_fake_loss=1.4, generator_loss=31.81, generator_mel_loss=23.73, generator_kl_loss=1.963, generator_dur_loss=1.584, generator_adv_loss=2.051, generator_feat_match_loss=2.484, over 56.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.385, discriminator_fake_loss=1.337, generator_loss=30.95, generator_mel_loss=23.27, generator_kl_loss=1.935, generator_dur_loss=1.565, generator_adv_loss=1.906, generator_feat_match_loss=2.27, over 2468.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:15:41,827 INFO [train.py:845] (3/4) Start epoch 112 2024-02-22 16:19:13,465 INFO [train.py:845] (3/4) Start epoch 113 2024-02-22 16:19:56,106 INFO [train.py:471] (3/4) Epoch 113, batch 6, global_batch_idx: 4150, batch size: 81, loss[discriminator_loss=2.641, discriminator_real_loss=1.457, discriminator_fake_loss=1.184, 
generator_loss=30.45, generator_mel_loss=22.79, generator_kl_loss=1.914, generator_dur_loss=1.586, generator_adv_loss=1.738, generator_feat_match_loss=2.414, over 81.00 samples.], tot_loss[discriminator_loss=2.698, discriminator_real_loss=1.394, discriminator_fake_loss=1.304, generator_loss=30.7, generator_mel_loss=23.05, generator_kl_loss=1.908, generator_dur_loss=1.582, generator_adv_loss=1.882, generator_feat_match_loss=2.285, over 450.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:22:37,724 INFO [train.py:845] (3/4) Start epoch 114 2024-02-22 16:24:38,570 INFO [train.py:471] (3/4) Epoch 114, batch 19, global_batch_idx: 4200, batch size: 61, loss[discriminator_loss=2.758, discriminator_real_loss=1.33, discriminator_fake_loss=1.428, generator_loss=30.62, generator_mel_loss=22.95, generator_kl_loss=1.871, generator_dur_loss=1.561, generator_adv_loss=1.867, generator_feat_match_loss=2.379, over 61.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.377, discriminator_fake_loss=1.325, generator_loss=30.67, generator_mel_loss=22.95, generator_kl_loss=1.903, generator_dur_loss=1.569, generator_adv_loss=1.919, generator_feat_match_loss=2.328, over 1372.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:24:38,572 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 16:24:46,975 INFO [train.py:534] (3/4) Epoch 114, validation: discriminator_loss=2.712, discriminator_real_loss=1.322, discriminator_fake_loss=1.39, generator_loss=32.6, generator_mel_loss=24.62, generator_kl_loss=2.024, generator_dur_loss=1.562, generator_adv_loss=1.793, generator_feat_match_loss=2.605, over 100.00 samples. 
2024-02-22 16:24:46,976 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 16:26:11,325 INFO [train.py:845] (3/4) Start epoch 115 2024-02-22 16:29:16,207 INFO [train.py:471] (3/4) Epoch 115, batch 32, global_batch_idx: 4250, batch size: 53, loss[discriminator_loss=2.701, discriminator_real_loss=1.312, discriminator_fake_loss=1.389, generator_loss=30.3, generator_mel_loss=22.55, generator_kl_loss=1.903, generator_dur_loss=1.553, generator_adv_loss=1.984, generator_feat_match_loss=2.305, over 53.00 samples.], tot_loss[discriminator_loss=2.762, discriminator_real_loss=1.412, discriminator_fake_loss=1.351, generator_loss=30.61, generator_mel_loss=23, generator_kl_loss=1.919, generator_dur_loss=1.564, generator_adv_loss=1.906, generator_feat_match_loss=2.226, over 2317.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:29:38,744 INFO [train.py:845] (3/4) Start epoch 116 2024-02-22 16:33:01,026 INFO [train.py:845] (3/4) Start epoch 117 2024-02-22 16:33:58,388 INFO [train.py:471] (3/4) Epoch 117, batch 8, global_batch_idx: 4300, batch size: 101, loss[discriminator_loss=2.699, discriminator_real_loss=1.42, discriminator_fake_loss=1.279, generator_loss=31.03, generator_mel_loss=23.36, generator_kl_loss=1.923, generator_dur_loss=1.551, generator_adv_loss=1.854, generator_feat_match_loss=2.346, over 101.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.363, discriminator_fake_loss=1.338, generator_loss=30.9, generator_mel_loss=23.17, generator_kl_loss=1.908, generator_dur_loss=1.563, generator_adv_loss=1.915, generator_feat_match_loss=2.342, over 745.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:36:35,873 INFO [train.py:845] (3/4) Start epoch 118 2024-02-22 16:38:48,971 INFO [train.py:471] (3/4) Epoch 118, batch 21, global_batch_idx: 4350, batch size: 59, loss[discriminator_loss=2.732, discriminator_real_loss=1.265, discriminator_fake_loss=1.468, 
generator_loss=31.4, generator_mel_loss=23.46, generator_kl_loss=1.984, generator_dur_loss=1.599, generator_adv_loss=2.088, generator_feat_match_loss=2.275, over 59.00 samples.], tot_loss[discriminator_loss=2.735, discriminator_real_loss=1.394, discriminator_fake_loss=1.341, generator_loss=30.75, generator_mel_loss=23.06, generator_kl_loss=1.926, generator_dur_loss=1.569, generator_adv_loss=1.893, generator_feat_match_loss=2.301, over 1603.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:40:06,409 INFO [train.py:845] (3/4) Start epoch 119 2024-02-22 16:43:28,110 INFO [train.py:471] (3/4) Epoch 119, batch 34, global_batch_idx: 4400, batch size: 69, loss[discriminator_loss=2.744, discriminator_real_loss=1.603, discriminator_fake_loss=1.142, generator_loss=29.27, generator_mel_loss=21.82, generator_kl_loss=1.864, generator_dur_loss=1.548, generator_adv_loss=1.828, generator_feat_match_loss=2.217, over 69.00 samples.], tot_loss[discriminator_loss=2.732, discriminator_real_loss=1.392, discriminator_fake_loss=1.34, generator_loss=30.77, generator_mel_loss=23.07, generator_kl_loss=1.945, generator_dur_loss=1.561, generator_adv_loss=1.9, generator_feat_match_loss=2.293, over 2526.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:43:28,111 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 16:43:36,720 INFO [train.py:534] (3/4) Epoch 119, validation: discriminator_loss=2.647, discriminator_real_loss=1.239, discriminator_fake_loss=1.408, generator_loss=31.25, generator_mel_loss=23.6, generator_kl_loss=1.934, generator_dur_loss=1.557, generator_adv_loss=1.718, generator_feat_match_loss=2.443, over 100.00 samples. 
2024-02-22 16:43:36,720 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 16:43:46,581 INFO [train.py:845] (3/4) Start epoch 120 2024-02-22 16:47:15,390 INFO [train.py:845] (3/4) Start epoch 121 2024-02-22 16:48:24,268 INFO [train.py:471] (3/4) Epoch 121, batch 10, global_batch_idx: 4450, batch size: 101, loss[discriminator_loss=2.699, discriminator_real_loss=1.294, discriminator_fake_loss=1.404, generator_loss=31.35, generator_mel_loss=23.6, generator_kl_loss=2.046, generator_dur_loss=1.553, generator_adv_loss=1.897, generator_feat_match_loss=2.246, over 101.00 samples.], tot_loss[discriminator_loss=2.746, discriminator_real_loss=1.375, discriminator_fake_loss=1.371, generator_loss=30.78, generator_mel_loss=23.12, generator_kl_loss=1.948, generator_dur_loss=1.567, generator_adv_loss=1.893, generator_feat_match_loss=2.258, over 712.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:50:46,914 INFO [train.py:845] (3/4) Start epoch 122 2024-02-22 16:53:06,842 INFO [train.py:471] (3/4) Epoch 122, batch 23, global_batch_idx: 4500, batch size: 53, loss[discriminator_loss=2.695, discriminator_real_loss=1.416, discriminator_fake_loss=1.278, generator_loss=30.81, generator_mel_loss=23.07, generator_kl_loss=1.972, generator_dur_loss=1.577, generator_adv_loss=1.814, generator_feat_match_loss=2.377, over 53.00 samples.], tot_loss[discriminator_loss=2.733, discriminator_real_loss=1.387, discriminator_fake_loss=1.345, generator_loss=30.76, generator_mel_loss=23.02, generator_kl_loss=1.92, generator_dur_loss=1.552, generator_adv_loss=1.942, generator_feat_match_loss=2.323, over 1875.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:54:14,802 INFO [train.py:845] (3/4) Start epoch 123 2024-02-22 16:57:38,763 INFO [train.py:471] (3/4) Epoch 123, batch 36, global_batch_idx: 4550, batch size: 60, loss[discriminator_loss=2.717, discriminator_real_loss=1.354, 
discriminator_fake_loss=1.363, generator_loss=31.57, generator_mel_loss=23.54, generator_kl_loss=1.951, generator_dur_loss=1.57, generator_adv_loss=2.145, generator_feat_match_loss=2.363, over 60.00 samples.], tot_loss[discriminator_loss=2.723, discriminator_real_loss=1.384, discriminator_fake_loss=1.338, generator_loss=30.74, generator_mel_loss=23.03, generator_kl_loss=1.931, generator_dur_loss=1.556, generator_adv_loss=1.889, generator_feat_match_loss=2.329, over 2761.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 16:57:39,210 INFO [train.py:845] (3/4) Start epoch 124 2024-02-22 17:01:06,726 INFO [train.py:845] (3/4) Start epoch 125 2024-02-22 17:02:30,278 INFO [train.py:471] (3/4) Epoch 125, batch 12, global_batch_idx: 4600, batch size: 59, loss[discriminator_loss=2.742, discriminator_real_loss=1.256, discriminator_fake_loss=1.486, generator_loss=31.2, generator_mel_loss=23.07, generator_kl_loss=2.032, generator_dur_loss=1.558, generator_adv_loss=2.148, generator_feat_match_loss=2.393, over 59.00 samples.], tot_loss[discriminator_loss=2.731, discriminator_real_loss=1.382, discriminator_fake_loss=1.348, generator_loss=30.71, generator_mel_loss=23, generator_kl_loss=1.905, generator_dur_loss=1.564, generator_adv_loss=1.92, generator_feat_match_loss=2.322, over 895.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:02:30,279 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 17:02:38,870 INFO [train.py:534] (3/4) Epoch 125, validation: discriminator_loss=2.717, discriminator_real_loss=1.458, discriminator_fake_loss=1.259, generator_loss=32.11, generator_mel_loss=24.07, generator_kl_loss=1.999, generator_dur_loss=1.546, generator_adv_loss=2.02, generator_feat_match_loss=2.477, over 100.00 samples. 
2024-02-22 17:02:38,870 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 17:04:48,274 INFO [train.py:845] (3/4) Start epoch 126 2024-02-22 17:07:12,594 INFO [train.py:471] (3/4) Epoch 126, batch 25, global_batch_idx: 4650, batch size: 56, loss[discriminator_loss=2.865, discriminator_real_loss=1.361, discriminator_fake_loss=1.504, generator_loss=31.53, generator_mel_loss=23.22, generator_kl_loss=1.998, generator_dur_loss=1.555, generator_adv_loss=2.428, generator_feat_match_loss=2.322, over 56.00 samples.], tot_loss[discriminator_loss=2.73, discriminator_real_loss=1.387, discriminator_fake_loss=1.343, generator_loss=30.87, generator_mel_loss=23.02, generator_kl_loss=1.946, generator_dur_loss=1.557, generator_adv_loss=1.944, generator_feat_match_loss=2.399, over 1667.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:08:11,814 INFO [train.py:845] (3/4) Start epoch 127 2024-02-22 17:11:46,562 INFO [train.py:845] (3/4) Start epoch 128 2024-02-22 17:12:09,138 INFO [train.py:471] (3/4) Epoch 128, batch 1, global_batch_idx: 4700, batch size: 153, loss[discriminator_loss=2.764, discriminator_real_loss=1.562, discriminator_fake_loss=1.202, generator_loss=31.1, generator_mel_loss=23.49, generator_kl_loss=1.952, generator_dur_loss=1.506, generator_adv_loss=1.897, generator_feat_match_loss=2.258, over 153.00 samples.], tot_loss[discriminator_loss=2.787, discriminator_real_loss=1.514, discriminator_fake_loss=1.273, generator_loss=30.85, generator_mel_loss=23.24, generator_kl_loss=1.928, generator_dur_loss=1.517, generator_adv_loss=1.924, generator_feat_match_loss=2.241, over 213.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:15:20,476 INFO [train.py:845] (3/4) Start epoch 129 2024-02-22 17:16:47,581 INFO [train.py:471] (3/4) Epoch 129, batch 14, global_batch_idx: 4750, batch size: 110, loss[discriminator_loss=2.719, discriminator_real_loss=1.495, 
discriminator_fake_loss=1.223, generator_loss=30.83, generator_mel_loss=23.04, generator_kl_loss=1.94, generator_dur_loss=1.54, generator_adv_loss=1.944, generator_feat_match_loss=2.363, over 110.00 samples.], tot_loss[discriminator_loss=2.743, discriminator_real_loss=1.384, discriminator_fake_loss=1.359, generator_loss=30.56, generator_mel_loss=22.79, generator_kl_loss=1.937, generator_dur_loss=1.545, generator_adv_loss=1.937, generator_feat_match_loss=2.347, over 1323.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:18:52,310 INFO [train.py:845] (3/4) Start epoch 130 2024-02-22 17:21:28,610 INFO [train.py:471] (3/4) Epoch 130, batch 27, global_batch_idx: 4800, batch size: 64, loss[discriminator_loss=2.666, discriminator_real_loss=1.357, discriminator_fake_loss=1.309, generator_loss=30.11, generator_mel_loss=22.37, generator_kl_loss=1.865, generator_dur_loss=1.553, generator_adv_loss=1.998, generator_feat_match_loss=2.328, over 64.00 samples.], tot_loss[discriminator_loss=2.751, discriminator_real_loss=1.397, discriminator_fake_loss=1.354, generator_loss=30.35, generator_mel_loss=22.72, generator_kl_loss=1.93, generator_dur_loss=1.552, generator_adv_loss=1.88, generator_feat_match_loss=2.27, over 2088.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:21:28,611 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 17:21:37,066 INFO [train.py:534] (3/4) Epoch 130, validation: discriminator_loss=2.623, discriminator_real_loss=1.363, discriminator_fake_loss=1.26, generator_loss=31.27, generator_mel_loss=23.43, generator_kl_loss=1.931, generator_dur_loss=1.541, generator_adv_loss=1.919, generator_feat_match_loss=2.456, over 100.00 samples. 
2024-02-22 17:21:37,066 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB 2024-02-22 17:22:26,867 INFO [train.py:845] (3/4) Start epoch 131 2024-02-22 17:25:55,321 INFO [train.py:845] (3/4) Start epoch 132 2024-02-22 17:26:27,005 INFO [train.py:471] (3/4) Epoch 132, batch 3, global_batch_idx: 4850, batch size: 101, loss[discriminator_loss=2.711, discriminator_real_loss=1.452, discriminator_fake_loss=1.26, generator_loss=31.04, generator_mel_loss=23.28, generator_kl_loss=2.012, generator_dur_loss=1.539, generator_adv_loss=1.846, generator_feat_match_loss=2.369, over 101.00 samples.], tot_loss[discriminator_loss=2.732, discriminator_real_loss=1.388, discriminator_fake_loss=1.345, generator_loss=30.78, generator_mel_loss=23.08, generator_kl_loss=1.949, generator_dur_loss=1.533, generator_adv_loss=1.886, generator_feat_match_loss=2.335, over 418.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:29:22,427 INFO [train.py:845] (3/4) Start epoch 133 2024-02-22 17:30:56,805 INFO [train.py:471] (3/4) Epoch 133, batch 16, global_batch_idx: 4900, batch size: 110, loss[discriminator_loss=2.686, discriminator_real_loss=1.377, discriminator_fake_loss=1.309, generator_loss=30.78, generator_mel_loss=23.07, generator_kl_loss=1.943, generator_dur_loss=1.508, generator_adv_loss=1.856, generator_feat_match_loss=2.406, over 110.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.392, discriminator_fake_loss=1.347, generator_loss=30.63, generator_mel_loss=22.93, generator_kl_loss=1.964, generator_dur_loss=1.544, generator_adv_loss=1.869, generator_feat_match_loss=2.325, over 1411.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:32:49,915 INFO [train.py:845] (3/4) Start epoch 134 2024-02-22 17:35:37,053 INFO [train.py:471] (3/4) Epoch 134, batch 29, global_batch_idx: 4950, batch size: 52, loss[discriminator_loss=2.699, discriminator_real_loss=1.303, 
discriminator_fake_loss=1.397, generator_loss=30.95, generator_mel_loss=23.12, generator_kl_loss=1.88, generator_dur_loss=1.572, generator_adv_loss=1.938, generator_feat_match_loss=2.434, over 52.00 samples.], tot_loss[discriminator_loss=2.716, discriminator_real_loss=1.377, discriminator_fake_loss=1.339, generator_loss=30.68, generator_mel_loss=22.86, generator_kl_loss=1.93, generator_dur_loss=1.55, generator_adv_loss=1.909, generator_feat_match_loss=2.424, over 2249.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:36:15,983 INFO [train.py:845] (3/4) Start epoch 135 2024-02-22 17:39:41,761 INFO [train.py:845] (3/4) Start epoch 136 2024-02-22 17:40:21,138 INFO [train.py:471] (3/4) Epoch 136, batch 5, global_batch_idx: 5000, batch size: 76, loss[discriminator_loss=2.709, discriminator_real_loss=1.5, discriminator_fake_loss=1.209, generator_loss=30.61, generator_mel_loss=22.83, generator_kl_loss=1.975, generator_dur_loss=1.544, generator_adv_loss=1.756, generator_feat_match_loss=2.512, over 76.00 samples.], tot_loss[discriminator_loss=2.713, discriminator_real_loss=1.407, discriminator_fake_loss=1.306, generator_loss=30.19, generator_mel_loss=22.4, generator_kl_loss=1.916, generator_dur_loss=1.556, generator_adv_loss=1.909, generator_feat_match_loss=2.417, over 406.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0 2024-02-22 17:40:21,140 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 17:40:30,080 INFO [train.py:534] (3/4) Epoch 136, validation: discriminator_loss=2.765, discriminator_real_loss=1.236, discriminator_fake_loss=1.53, generator_loss=31.5, generator_mel_loss=23.68, generator_kl_loss=2.09, generator_dur_loss=1.539, generator_adv_loss=1.613, generator_feat_match_loss=2.586, over 100.00 samples. 
2024-02-22 17:40:30,081 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 17:43:14,225 INFO [train.py:845] (3/4) Start epoch 137
2024-02-22 17:45:06,359 INFO [train.py:471] (3/4) Epoch 137, batch 18, global_batch_idx: 5050, batch size: 79, loss[discriminator_loss=2.766, discriminator_real_loss=1.423, discriminator_fake_loss=1.343, generator_loss=29.96, generator_mel_loss=22.33, generator_kl_loss=1.933, generator_dur_loss=1.551, generator_adv_loss=1.928, generator_feat_match_loss=2.223, over 79.00 samples.], tot_loss[discriminator_loss=2.736, discriminator_real_loss=1.4, discriminator_fake_loss=1.336, generator_loss=30.37, generator_mel_loss=22.68, generator_kl_loss=1.932, generator_dur_loss=1.553, generator_adv_loss=1.893, generator_feat_match_loss=2.316, over 1405.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0
2024-02-22 17:46:45,600 INFO [train.py:845] (3/4) Start epoch 138
2024-02-22 17:49:50,528 INFO [train.py:471] (3/4) Epoch 138, batch 31, global_batch_idx: 5100, batch size: 56, loss[discriminator_loss=2.707, discriminator_real_loss=1.485, discriminator_fake_loss=1.223, generator_loss=29.97, generator_mel_loss=22.39, generator_kl_loss=1.846, generator_dur_loss=1.551, generator_adv_loss=1.86, generator_feat_match_loss=2.322, over 56.00 samples.], tot_loss[discriminator_loss=2.729, discriminator_real_loss=1.388, discriminator_fake_loss=1.341, generator_loss=30.41, generator_mel_loss=22.63, generator_kl_loss=1.917, generator_dur_loss=1.544, generator_adv_loss=1.908, generator_feat_match_loss=2.411, over 2365.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0
2024-02-22 17:50:18,424 INFO [train.py:845] (3/4) Start epoch 139
2024-02-22 17:53:43,296 INFO [train.py:845] (3/4) Start epoch 140
2024-02-22 17:54:33,521 INFO [train.py:471] (3/4) Epoch 140, batch 7, global_batch_idx: 5150, batch size: 60, loss[discriminator_loss=2.715, discriminator_real_loss=1.467, discriminator_fake_loss=1.247, generator_loss=30.71, generator_mel_loss=23.02, generator_kl_loss=1.877, generator_dur_loss=1.535, generator_adv_loss=1.85, generator_feat_match_loss=2.42, over 60.00 samples.], tot_loss[discriminator_loss=2.761, discriminator_real_loss=1.416, discriminator_fake_loss=1.344, generator_loss=30.27, generator_mel_loss=22.62, generator_kl_loss=1.884, generator_dur_loss=1.543, generator_adv_loss=1.879, generator_feat_match_loss=2.339, over 497.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0
2024-02-22 17:57:17,885 INFO [train.py:845] (3/4) Start epoch 141
2024-02-22 17:59:20,191 INFO [train.py:471] (3/4) Epoch 141, batch 20, global_batch_idx: 5200, batch size: 60, loss[discriminator_loss=2.816, discriminator_real_loss=1.263, discriminator_fake_loss=1.555, generator_loss=30.81, generator_mel_loss=22.79, generator_kl_loss=2.023, generator_dur_loss=1.563, generator_adv_loss=2.072, generator_feat_match_loss=2.365, over 60.00 samples.], tot_loss[discriminator_loss=2.736, discriminator_real_loss=1.41, discriminator_fake_loss=1.326, generator_loss=30.36, generator_mel_loss=22.67, generator_kl_loss=1.924, generator_dur_loss=1.541, generator_adv_loss=1.877, generator_feat_match_loss=2.348, over 1660.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0
2024-02-22 17:59:20,192 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 17:59:28,675 INFO [train.py:534] (3/4) Epoch 141, validation: discriminator_loss=2.677, discriminator_real_loss=1.462, discriminator_fake_loss=1.214, generator_loss=31.78, generator_mel_loss=23.67, generator_kl_loss=2.071, generator_dur_loss=1.541, generator_adv_loss=1.969, generator_feat_match_loss=2.528, over 100.00 samples.
2024-02-22 17:59:28,676 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 18:00:49,809 INFO [train.py:845] (3/4) Start epoch 142
2024-02-22 18:03:58,579 INFO [train.py:471] (3/4) Epoch 142, batch 33, global_batch_idx: 5250, batch size: 63, loss[discriminator_loss=2.766, discriminator_real_loss=1.376, discriminator_fake_loss=1.389, generator_loss=29.9, generator_mel_loss=22.31, generator_kl_loss=1.981, generator_dur_loss=1.521, generator_adv_loss=1.928, generator_feat_match_loss=2.164, over 63.00 samples.], tot_loss[discriminator_loss=2.776, discriminator_real_loss=1.43, discriminator_fake_loss=1.346, generator_loss=30.41, generator_mel_loss=22.63, generator_kl_loss=1.936, generator_dur_loss=1.542, generator_adv_loss=1.925, generator_feat_match_loss=2.373, over 2641.00 samples.], cur_lr_g: 1.97e-04, cur_lr_d: 1.97e-04, grad_scale: 128.0
2024-02-22 18:04:16,213 INFO [train.py:845] (3/4) Start epoch 143
2024-02-22 18:07:41,598 INFO [train.py:845] (3/4) Start epoch 144
2024-02-22 18:08:48,249 INFO [train.py:471] (3/4) Epoch 144, batch 9, global_batch_idx: 5300, batch size: 110, loss[discriminator_loss=2.723, discriminator_real_loss=1.271, discriminator_fake_loss=1.45, generator_loss=30.77, generator_mel_loss=22.89, generator_kl_loss=1.965, generator_dur_loss=1.52, generator_adv_loss=1.987, generator_feat_match_loss=2.412, over 110.00 samples.], tot_loss[discriminator_loss=2.729, discriminator_real_loss=1.394, discriminator_fake_loss=1.335, generator_loss=30.4, generator_mel_loss=22.62, generator_kl_loss=1.933, generator_dur_loss=1.551, generator_adv_loss=1.886, generator_feat_match_loss=2.405, over 746.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:11:14,614 INFO [train.py:845] (3/4) Start epoch 145
2024-02-22 18:13:26,091 INFO [train.py:471] (3/4) Epoch 145, batch 22, global_batch_idx: 5350, batch size: 102, loss[discriminator_loss=2.754, discriminator_real_loss=1.375, discriminator_fake_loss=1.378, generator_loss=30.22, generator_mel_loss=22.62, generator_kl_loss=1.77, generator_dur_loss=1.522, generator_adv_loss=1.782, generator_feat_match_loss=2.527, over 102.00 samples.], tot_loss[discriminator_loss=2.733, discriminator_real_loss=1.388, discriminator_fake_loss=1.345, generator_loss=30.3, generator_mel_loss=22.53, generator_kl_loss=1.881, generator_dur_loss=1.54, generator_adv_loss=1.901, generator_feat_match_loss=2.445, over 1798.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:14:46,131 INFO [train.py:845] (3/4) Start epoch 146
2024-02-22 18:18:09,063 INFO [train.py:471] (3/4) Epoch 146, batch 35, global_batch_idx: 5400, batch size: 61, loss[discriminator_loss=2.689, discriminator_real_loss=1.386, discriminator_fake_loss=1.304, generator_loss=30.06, generator_mel_loss=22.27, generator_kl_loss=1.869, generator_dur_loss=1.546, generator_adv_loss=1.821, generator_feat_match_loss=2.553, over 61.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.404, discriminator_fake_loss=1.349, generator_loss=30.42, generator_mel_loss=22.66, generator_kl_loss=1.92, generator_dur_loss=1.536, generator_adv_loss=1.896, generator_feat_match_loss=2.409, over 2786.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:18:09,064 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 18:18:19,170 INFO [train.py:534] (3/4) Epoch 146, validation: discriminator_loss=2.684, discriminator_real_loss=1.228, discriminator_fake_loss=1.456, generator_loss=31.15, generator_mel_loss=23.34, generator_kl_loss=1.962, generator_dur_loss=1.539, generator_adv_loss=1.713, generator_feat_match_loss=2.603, over 100.00 samples.
2024-02-22 18:18:19,171 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 18:18:24,296 INFO [train.py:845] (3/4) Start epoch 147
2024-02-22 18:21:51,147 INFO [train.py:845] (3/4) Start epoch 148
2024-02-22 18:22:58,077 INFO [train.py:471] (3/4) Epoch 148, batch 11, global_batch_idx: 5450, batch size: 85, loss[discriminator_loss=2.764, discriminator_real_loss=1.442, discriminator_fake_loss=1.321, generator_loss=30.13, generator_mel_loss=22.41, generator_kl_loss=1.977, generator_dur_loss=1.551, generator_adv_loss=1.79, generator_feat_match_loss=2.406, over 85.00 samples.], tot_loss[discriminator_loss=2.718, discriminator_real_loss=1.367, discriminator_fake_loss=1.351, generator_loss=30.64, generator_mel_loss=22.7, generator_kl_loss=1.926, generator_dur_loss=1.548, generator_adv_loss=1.921, generator_feat_match_loss=2.544, over 871.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:25:17,263 INFO [train.py:845] (3/4) Start epoch 149
2024-02-22 18:27:41,934 INFO [train.py:471] (3/4) Epoch 149, batch 24, global_batch_idx: 5500, batch size: 60, loss[discriminator_loss=2.723, discriminator_real_loss=1.455, discriminator_fake_loss=1.268, generator_loss=30.73, generator_mel_loss=22.89, generator_kl_loss=1.905, generator_dur_loss=1.569, generator_adv_loss=1.929, generator_feat_match_loss=2.445, over 60.00 samples.], tot_loss[discriminator_loss=2.728, discriminator_real_loss=1.398, discriminator_fake_loss=1.329, generator_loss=30.34, generator_mel_loss=22.52, generator_kl_loss=1.922, generator_dur_loss=1.545, generator_adv_loss=1.904, generator_feat_match_loss=2.447, over 1660.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:28:48,712 INFO [train.py:845] (3/4) Start epoch 150
2024-02-22 18:32:12,609 INFO [train.py:845] (3/4) Start epoch 151
2024-02-22 18:32:24,594 INFO [train.py:471] (3/4) Epoch 151, batch 0, global_batch_idx: 5550, batch size: 69, loss[discriminator_loss=2.777, discriminator_real_loss=1.416, discriminator_fake_loss=1.362, generator_loss=30.35, generator_mel_loss=22.78, generator_kl_loss=1.903, generator_dur_loss=1.53, generator_adv_loss=1.88, generator_feat_match_loss=2.262, over 69.00 samples.], tot_loss[discriminator_loss=2.777, discriminator_real_loss=1.416, discriminator_fake_loss=1.362, generator_loss=30.35, generator_mel_loss=22.78, generator_kl_loss=1.903, generator_dur_loss=1.53, generator_adv_loss=1.88, generator_feat_match_loss=2.262, over 69.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:35:39,329 INFO [train.py:845] (3/4) Start epoch 152
2024-02-22 18:36:57,596 INFO [train.py:471] (3/4) Epoch 152, batch 13, global_batch_idx: 5600, batch size: 53, loss[discriminator_loss=2.738, discriminator_real_loss=1.309, discriminator_fake_loss=1.431, generator_loss=29.91, generator_mel_loss=21.87, generator_kl_loss=1.859, generator_dur_loss=1.553, generator_adv_loss=2.16, generator_feat_match_loss=2.463, over 53.00 samples.], tot_loss[discriminator_loss=2.707, discriminator_real_loss=1.383, discriminator_fake_loss=1.324, generator_loss=30.19, generator_mel_loss=22.37, generator_kl_loss=1.887, generator_dur_loss=1.54, generator_adv_loss=1.931, generator_feat_match_loss=2.47, over 969.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:36:57,597 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 18:37:06,545 INFO [train.py:534] (3/4) Epoch 152, validation: discriminator_loss=2.756, discriminator_real_loss=1.49, discriminator_fake_loss=1.266, generator_loss=31.42, generator_mel_loss=23.29, generator_kl_loss=1.951, generator_dur_loss=1.531, generator_adv_loss=2.054, generator_feat_match_loss=2.602, over 100.00 samples.
2024-02-22 18:37:06,546 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 18:39:12,223 INFO [train.py:845] (3/4) Start epoch 153
2024-02-22 18:41:47,655 INFO [train.py:471] (3/4) Epoch 153, batch 26, global_batch_idx: 5650, batch size: 55, loss[discriminator_loss=2.725, discriminator_real_loss=1.298, discriminator_fake_loss=1.427, generator_loss=30.1, generator_mel_loss=22.5, generator_kl_loss=1.839, generator_dur_loss=1.573, generator_adv_loss=1.84, generator_feat_match_loss=2.35, over 55.00 samples.], tot_loss[discriminator_loss=2.74, discriminator_real_loss=1.397, discriminator_fake_loss=1.343, generator_loss=30.33, generator_mel_loss=22.56, generator_kl_loss=1.89, generator_dur_loss=1.541, generator_adv_loss=1.896, generator_feat_match_loss=2.437, over 2022.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:42:45,296 INFO [train.py:845] (3/4) Start epoch 154
2024-02-22 18:46:10,326 INFO [train.py:845] (3/4) Start epoch 155
2024-02-22 18:46:36,324 INFO [train.py:471] (3/4) Epoch 155, batch 2, global_batch_idx: 5700, batch size: 50, loss[discriminator_loss=2.672, discriminator_real_loss=1.29, discriminator_fake_loss=1.381, generator_loss=30.14, generator_mel_loss=22.2, generator_kl_loss=1.92, generator_dur_loss=1.552, generator_adv_loss=2.053, generator_feat_match_loss=2.414, over 50.00 samples.], tot_loss[discriminator_loss=2.699, discriminator_real_loss=1.385, discriminator_fake_loss=1.314, generator_loss=30.06, generator_mel_loss=22.38, generator_kl_loss=1.884, generator_dur_loss=1.541, generator_adv_loss=1.921, generator_feat_match_loss=2.34, over 187.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:49:43,763 INFO [train.py:845] (3/4) Start epoch 156
2024-02-22 18:51:20,104 INFO [train.py:471] (3/4) Epoch 156, batch 15, global_batch_idx: 5750, batch size: 126, loss[discriminator_loss=2.715, discriminator_real_loss=1.361, discriminator_fake_loss=1.353, generator_loss=31.04, generator_mel_loss=22.9, generator_kl_loss=1.913, generator_dur_loss=1.544, generator_adv_loss=2.051, generator_feat_match_loss=2.625, over 126.00 samples.], tot_loss[discriminator_loss=2.709, discriminator_real_loss=1.366, discriminator_fake_loss=1.343, generator_loss=30.37, generator_mel_loss=22.51, generator_kl_loss=1.884, generator_dur_loss=1.537, generator_adv_loss=1.919, generator_feat_match_loss=2.521, over 1258.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:53:17,240 INFO [train.py:845] (3/4) Start epoch 157
2024-02-22 18:56:06,308 INFO [train.py:471] (3/4) Epoch 157, batch 28, global_batch_idx: 5800, batch size: 52, loss[discriminator_loss=2.676, discriminator_real_loss=1.305, discriminator_fake_loss=1.371, generator_loss=29.96, generator_mel_loss=22.01, generator_kl_loss=1.898, generator_dur_loss=1.542, generator_adv_loss=2.016, generator_feat_match_loss=2.488, over 52.00 samples.], tot_loss[discriminator_loss=2.741, discriminator_real_loss=1.391, discriminator_fake_loss=1.35, generator_loss=30.11, generator_mel_loss=22.37, generator_kl_loss=1.927, generator_dur_loss=1.536, generator_adv_loss=1.877, generator_feat_match_loss=2.402, over 2075.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 18:56:06,309 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 18:56:14,946 INFO [train.py:534] (3/4) Epoch 157, validation: discriminator_loss=2.62, discriminator_real_loss=1.354, discriminator_fake_loss=1.267, generator_loss=30.58, generator_mel_loss=22.7, generator_kl_loss=1.914, generator_dur_loss=1.532, generator_adv_loss=1.954, generator_feat_match_loss=2.476, over 100.00 samples.
2024-02-22 18:56:14,948 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 18:56:53,339 INFO [train.py:845] (3/4) Start epoch 158
2024-02-22 19:00:21,671 INFO [train.py:845] (3/4) Start epoch 159
2024-02-22 19:01:00,205 INFO [train.py:471] (3/4) Epoch 159, batch 4, global_batch_idx: 5850, batch size: 61, loss[discriminator_loss=2.725, discriminator_real_loss=1.41, discriminator_fake_loss=1.314, generator_loss=30.16, generator_mel_loss=22.51, generator_kl_loss=1.947, generator_dur_loss=1.538, generator_adv_loss=1.785, generator_feat_match_loss=2.381, over 61.00 samples.], tot_loss[discriminator_loss=2.785, discriminator_real_loss=1.421, discriminator_fake_loss=1.364, generator_loss=29.91, generator_mel_loss=22.3, generator_kl_loss=1.891, generator_dur_loss=1.548, generator_adv_loss=1.852, generator_feat_match_loss=2.326, over 341.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 19:03:47,346 INFO [train.py:845] (3/4) Start epoch 160
2024-02-22 19:05:36,066 INFO [train.py:471] (3/4) Epoch 160, batch 17, global_batch_idx: 5900, batch size: 54, loss[discriminator_loss=2.785, discriminator_real_loss=1.324, discriminator_fake_loss=1.462, generator_loss=30.69, generator_mel_loss=22.74, generator_kl_loss=1.832, generator_dur_loss=1.569, generator_adv_loss=2.121, generator_feat_match_loss=2.426, over 54.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.388, discriminator_fake_loss=1.351, generator_loss=30.08, generator_mel_loss=22.2, generator_kl_loss=1.901, generator_dur_loss=1.537, generator_adv_loss=1.923, generator_feat_match_loss=2.518, over 1330.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 19:07:18,182 INFO [train.py:845] (3/4) Start epoch 161
2024-02-22 19:10:19,416 INFO [train.py:471] (3/4) Epoch 161, batch 30, global_batch_idx: 5950, batch size: 79, loss[discriminator_loss=2.689, discriminator_real_loss=1.386, discriminator_fake_loss=1.304, generator_loss=30.12, generator_mel_loss=22.3, generator_kl_loss=1.909, generator_dur_loss=1.551, generator_adv_loss=1.787, generator_feat_match_loss=2.574, over 79.00 samples.], tot_loss[discriminator_loss=2.754, discriminator_real_loss=1.407, discriminator_fake_loss=1.347, generator_loss=29.98, generator_mel_loss=22.24, generator_kl_loss=1.898, generator_dur_loss=1.531, generator_adv_loss=1.89, generator_feat_match_loss=2.42, over 2382.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 128.0
2024-02-22 19:10:46,818 INFO [train.py:845] (3/4) Start epoch 162
2024-02-22 19:14:19,473 INFO [train.py:845] (3/4) Start epoch 163
2024-02-22 19:15:10,138 INFO [train.py:471] (3/4) Epoch 163, batch 6, global_batch_idx: 6000, batch size: 58, loss[discriminator_loss=2.824, discriminator_real_loss=1.554, discriminator_fake_loss=1.271, generator_loss=29.76, generator_mel_loss=21.8, generator_kl_loss=1.988, generator_dur_loss=1.529, generator_adv_loss=1.989, generator_feat_match_loss=2.443, over 58.00 samples.], tot_loss[discriminator_loss=2.889, discriminator_real_loss=1.492, discriminator_fake_loss=1.397, generator_loss=30.27, generator_mel_loss=22.37, generator_kl_loss=1.941, generator_dur_loss=1.54, generator_adv_loss=1.991, generator_feat_match_loss=2.424, over 532.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:15:10,140 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 19:15:18,967 INFO [train.py:534] (3/4) Epoch 163, validation: discriminator_loss=2.726, discriminator_real_loss=1.517, discriminator_fake_loss=1.209, generator_loss=31.62, generator_mel_loss=23.67, generator_kl_loss=2.024, generator_dur_loss=1.523, generator_adv_loss=1.958, generator_feat_match_loss=2.451, over 100.00 samples.
2024-02-22 19:15:18,968 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 19:17:57,852 INFO [train.py:845] (3/4) Start epoch 164
2024-02-22 19:19:53,634 INFO [train.py:471] (3/4) Epoch 164, batch 19, global_batch_idx: 6050, batch size: 153, loss[discriminator_loss=2.709, discriminator_real_loss=1.277, discriminator_fake_loss=1.432, generator_loss=30.95, generator_mel_loss=22.87, generator_kl_loss=1.935, generator_dur_loss=1.513, generator_adv_loss=1.949, generator_feat_match_loss=2.684, over 153.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.389, discriminator_fake_loss=1.313, generator_loss=30.44, generator_mel_loss=22.47, generator_kl_loss=1.923, generator_dur_loss=1.526, generator_adv_loss=1.932, generator_feat_match_loss=2.588, over 1617.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:21:27,765 INFO [train.py:845] (3/4) Start epoch 165
2024-02-22 19:24:28,182 INFO [train.py:471] (3/4) Epoch 165, batch 32, global_batch_idx: 6100, batch size: 85, loss[discriminator_loss=2.75, discriminator_real_loss=1.271, discriminator_fake_loss=1.478, generator_loss=30.02, generator_mel_loss=22.29, generator_kl_loss=1.848, generator_dur_loss=1.538, generator_adv_loss=1.938, generator_feat_match_loss=2.406, over 85.00 samples.], tot_loss[discriminator_loss=2.715, discriminator_real_loss=1.386, discriminator_fake_loss=1.329, generator_loss=30.17, generator_mel_loss=22.31, generator_kl_loss=1.907, generator_dur_loss=1.542, generator_adv_loss=1.915, generator_feat_match_loss=2.5, over 2302.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:24:48,842 INFO [train.py:845] (3/4) Start epoch 166
2024-02-22 19:28:22,550 INFO [train.py:845] (3/4) Start epoch 167
2024-02-22 19:29:23,705 INFO [train.py:471] (3/4) Epoch 167, batch 8, global_batch_idx: 6150, batch size: 64, loss[discriminator_loss=2.707, discriminator_real_loss=1.322, discriminator_fake_loss=1.385, generator_loss=29.8, generator_mel_loss=21.9, generator_kl_loss=1.896, generator_dur_loss=1.538, generator_adv_loss=2.014, generator_feat_match_loss=2.453, over 64.00 samples.], tot_loss[discriminator_loss=2.718, discriminator_real_loss=1.362, discriminator_fake_loss=1.356, generator_loss=30.15, generator_mel_loss=22.32, generator_kl_loss=1.902, generator_dur_loss=1.541, generator_adv_loss=1.898, generator_feat_match_loss=2.488, over 597.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:31:52,148 INFO [train.py:845] (3/4) Start epoch 168
2024-02-22 19:34:10,706 INFO [train.py:471] (3/4) Epoch 168, batch 21, global_batch_idx: 6200, batch size: 73, loss[discriminator_loss=2.809, discriminator_real_loss=1.317, discriminator_fake_loss=1.49, generator_loss=30.14, generator_mel_loss=22.23, generator_kl_loss=1.836, generator_dur_loss=1.52, generator_adv_loss=2.072, generator_feat_match_loss=2.486, over 73.00 samples.], tot_loss[discriminator_loss=2.711, discriminator_real_loss=1.379, discriminator_fake_loss=1.333, generator_loss=30.3, generator_mel_loss=22.36, generator_kl_loss=1.901, generator_dur_loss=1.527, generator_adv_loss=1.935, generator_feat_match_loss=2.574, over 1725.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:34:10,707 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 19:34:19,824 INFO [train.py:534] (3/4) Epoch 168, validation: discriminator_loss=2.789, discriminator_real_loss=1.408, discriminator_fake_loss=1.381, generator_loss=30.81, generator_mel_loss=22.79, generator_kl_loss=2.045, generator_dur_loss=1.523, generator_adv_loss=1.939, generator_feat_match_loss=2.519, over 100.00 samples.
2024-02-22 19:34:19,826 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 19:35:34,526 INFO [train.py:845] (3/4) Start epoch 169
2024-02-22 19:38:55,352 INFO [train.py:471] (3/4) Epoch 169, batch 34, global_batch_idx: 6250, batch size: 67, loss[discriminator_loss=2.711, discriminator_real_loss=1.345, discriminator_fake_loss=1.367, generator_loss=30.4, generator_mel_loss=22.42, generator_kl_loss=1.865, generator_dur_loss=1.559, generator_adv_loss=1.872, generator_feat_match_loss=2.688, over 67.00 samples.], tot_loss[discriminator_loss=2.715, discriminator_real_loss=1.379, discriminator_fake_loss=1.337, generator_loss=30.3, generator_mel_loss=22.34, generator_kl_loss=1.911, generator_dur_loss=1.534, generator_adv_loss=1.939, generator_feat_match_loss=2.576, over 2504.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:39:05,055 INFO [train.py:845] (3/4) Start epoch 170
2024-02-22 19:42:35,196 INFO [train.py:845] (3/4) Start epoch 171
2024-02-22 19:43:47,672 INFO [train.py:471] (3/4) Epoch 171, batch 10, global_batch_idx: 6300, batch size: 64, loss[discriminator_loss=2.748, discriminator_real_loss=1.549, discriminator_fake_loss=1.199, generator_loss=29.88, generator_mel_loss=22.08, generator_kl_loss=1.936, generator_dur_loss=1.537, generator_adv_loss=1.736, generator_feat_match_loss=2.588, over 64.00 samples.], tot_loss[discriminator_loss=2.714, discriminator_real_loss=1.393, discriminator_fake_loss=1.321, generator_loss=29.81, generator_mel_loss=22.02, generator_kl_loss=1.918, generator_dur_loss=1.537, generator_adv_loss=1.897, generator_feat_match_loss=2.44, over 789.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:46:12,347 INFO [train.py:845] (3/4) Start epoch 172
2024-02-22 19:48:22,596 INFO [train.py:471] (3/4) Epoch 172, batch 23, global_batch_idx: 6350, batch size: 69, loss[discriminator_loss=2.672, discriminator_real_loss=1.439, discriminator_fake_loss=1.231, generator_loss=29.95, generator_mel_loss=22.1, generator_kl_loss=1.907, generator_dur_loss=1.53, generator_adv_loss=1.896, generator_feat_match_loss=2.518, over 69.00 samples.], tot_loss[discriminator_loss=2.709, discriminator_real_loss=1.379, discriminator_fake_loss=1.329, generator_loss=30.14, generator_mel_loss=22.22, generator_kl_loss=1.904, generator_dur_loss=1.536, generator_adv_loss=1.915, generator_feat_match_loss=2.563, over 1769.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:49:38,489 INFO [train.py:845] (3/4) Start epoch 173
2024-02-22 19:53:07,641 INFO [train.py:471] (3/4) Epoch 173, batch 36, global_batch_idx: 6400, batch size: 61, loss[discriminator_loss=2.734, discriminator_real_loss=1.477, discriminator_fake_loss=1.257, generator_loss=29.57, generator_mel_loss=21.86, generator_kl_loss=1.928, generator_dur_loss=1.529, generator_adv_loss=1.785, generator_feat_match_loss=2.465, over 61.00 samples.], tot_loss[discriminator_loss=2.711, discriminator_real_loss=1.367, discriminator_fake_loss=1.344, generator_loss=30.08, generator_mel_loss=22.14, generator_kl_loss=1.91, generator_dur_loss=1.53, generator_adv_loss=1.921, generator_feat_match_loss=2.575, over 2808.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 19:53:07,643 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 19:53:15,945 INFO [train.py:534] (3/4) Epoch 173, validation: discriminator_loss=2.746, discriminator_real_loss=1.216, discriminator_fake_loss=1.53, generator_loss=30.82, generator_mel_loss=22.97, generator_kl_loss=2.056, generator_dur_loss=1.529, generator_adv_loss=1.662, generator_feat_match_loss=2.599, over 100.00 samples.
2024-02-22 19:53:15,946 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 19:53:16,404 INFO [train.py:845] (3/4) Start epoch 174
2024-02-22 19:56:49,219 INFO [train.py:845] (3/4) Start epoch 175
2024-02-22 19:58:14,355 INFO [train.py:471] (3/4) Epoch 175, batch 12, global_batch_idx: 6450, batch size: 51, loss[discriminator_loss=2.697, discriminator_real_loss=1.355, discriminator_fake_loss=1.342, generator_loss=30.23, generator_mel_loss=22.2, generator_kl_loss=1.862, generator_dur_loss=1.556, generator_adv_loss=2.041, generator_feat_match_loss=2.568, over 51.00 samples.], tot_loss[discriminator_loss=2.69, discriminator_real_loss=1.373, discriminator_fake_loss=1.317, generator_loss=30.13, generator_mel_loss=22.19, generator_kl_loss=1.904, generator_dur_loss=1.532, generator_adv_loss=1.897, generator_feat_match_loss=2.61, over 1035.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:00:21,807 INFO [train.py:845] (3/4) Start epoch 176
2024-02-22 20:02:44,751 INFO [train.py:471] (3/4) Epoch 176, batch 25, global_batch_idx: 6500, batch size: 110, loss[discriminator_loss=2.762, discriminator_real_loss=1.488, discriminator_fake_loss=1.273, generator_loss=30.17, generator_mel_loss=22.46, generator_kl_loss=1.884, generator_dur_loss=1.513, generator_adv_loss=1.786, generator_feat_match_loss=2.529, over 110.00 samples.], tot_loss[discriminator_loss=2.696, discriminator_real_loss=1.376, discriminator_fake_loss=1.32, generator_loss=30.32, generator_mel_loss=22.34, generator_kl_loss=1.893, generator_dur_loss=1.538, generator_adv_loss=1.932, generator_feat_match_loss=2.618, over 1803.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:03:49,142 INFO [train.py:845] (3/4) Start epoch 177
2024-02-22 20:07:16,904 INFO [train.py:845] (3/4) Start epoch 178
2024-02-22 20:07:34,521 INFO [train.py:471] (3/4) Epoch 178, batch 1, global_batch_idx: 6550, batch size: 69, loss[discriminator_loss=2.699, discriminator_real_loss=1.461, discriminator_fake_loss=1.237, generator_loss=30.57, generator_mel_loss=22.62, generator_kl_loss=1.972, generator_dur_loss=1.542, generator_adv_loss=1.881, generator_feat_match_loss=2.555, over 69.00 samples.], tot_loss[discriminator_loss=2.679, discriminator_real_loss=1.425, discriminator_fake_loss=1.253, generator_loss=30.69, generator_mel_loss=22.67, generator_kl_loss=1.925, generator_dur_loss=1.539, generator_adv_loss=1.975, generator_feat_match_loss=2.584, over 129.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:10:45,280 INFO [train.py:845] (3/4) Start epoch 179
2024-02-22 20:12:11,810 INFO [train.py:471] (3/4) Epoch 179, batch 14, global_batch_idx: 6600, batch size: 85, loss[discriminator_loss=2.615, discriminator_real_loss=1.244, discriminator_fake_loss=1.371, generator_loss=30.98, generator_mel_loss=22.65, generator_kl_loss=1.932, generator_dur_loss=1.523, generator_adv_loss=2.121, generator_feat_match_loss=2.748, over 85.00 samples.], tot_loss[discriminator_loss=2.722, discriminator_real_loss=1.374, discriminator_fake_loss=1.348, generator_loss=30.1, generator_mel_loss=22.2, generator_kl_loss=1.89, generator_dur_loss=1.529, generator_adv_loss=1.926, generator_feat_match_loss=2.559, over 1093.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:12:11,811 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 20:12:20,627 INFO [train.py:534] (3/4) Epoch 179, validation: discriminator_loss=2.647, discriminator_real_loss=1.388, discriminator_fake_loss=1.259, generator_loss=31.86, generator_mel_loss=23.53, generator_kl_loss=2.002, generator_dur_loss=1.527, generator_adv_loss=1.959, generator_feat_match_loss=2.849, over 100.00 samples.
2024-02-22 20:12:20,628 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 20:14:19,202 INFO [train.py:845] (3/4) Start epoch 180
2024-02-22 20:17:04,381 INFO [train.py:471] (3/4) Epoch 180, batch 27, global_batch_idx: 6650, batch size: 59, loss[discriminator_loss=2.721, discriminator_real_loss=1.094, discriminator_fake_loss=1.627, generator_loss=30.23, generator_mel_loss=21.84, generator_kl_loss=1.892, generator_dur_loss=1.546, generator_adv_loss=2.369, generator_feat_match_loss=2.588, over 59.00 samples.], tot_loss[discriminator_loss=2.708, discriminator_real_loss=1.38, discriminator_fake_loss=1.327, generator_loss=30.34, generator_mel_loss=22.33, generator_kl_loss=1.932, generator_dur_loss=1.526, generator_adv_loss=1.932, generator_feat_match_loss=2.614, over 2145.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:17:50,002 INFO [train.py:845] (3/4) Start epoch 181
2024-02-22 20:21:16,706 INFO [train.py:845] (3/4) Start epoch 182
2024-02-22 20:21:44,808 INFO [train.py:471] (3/4) Epoch 182, batch 3, global_batch_idx: 6700, batch size: 65, loss[discriminator_loss=2.641, discriminator_real_loss=1.411, discriminator_fake_loss=1.229, generator_loss=30.79, generator_mel_loss=22.72, generator_kl_loss=1.969, generator_dur_loss=1.537, generator_adv_loss=1.889, generator_feat_match_loss=2.68, over 65.00 samples.], tot_loss[discriminator_loss=2.7, discriminator_real_loss=1.395, discriminator_fake_loss=1.304, generator_loss=30.95, generator_mel_loss=22.93, generator_kl_loss=1.931, generator_dur_loss=1.532, generator_adv_loss=1.923, generator_feat_match_loss=2.639, over 296.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:24:39,558 INFO [train.py:845] (3/4) Start epoch 183
2024-02-22 20:26:20,640 INFO [train.py:471] (3/4) Epoch 183, batch 16, global_batch_idx: 6750, batch size: 51, loss[discriminator_loss=2.781, discriminator_real_loss=1.224, discriminator_fake_loss=1.559, generator_loss=30.31, generator_mel_loss=22.02, generator_kl_loss=1.892, generator_dur_loss=1.524, generator_adv_loss=2.361, generator_feat_match_loss=2.508, over 51.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.386, discriminator_fake_loss=1.338, generator_loss=29.91, generator_mel_loss=21.98, generator_kl_loss=1.893, generator_dur_loss=1.533, generator_adv_loss=1.941, generator_feat_match_loss=2.566, over 1064.00 samples.], cur_lr_g: 1.96e-04, cur_lr_d: 1.96e-04, grad_scale: 256.0
2024-02-22 20:28:09,004 INFO [train.py:845] (3/4) Start epoch 184
2024-02-22 20:31:04,469 INFO [train.py:471] (3/4) Epoch 184, batch 29, global_batch_idx: 6800, batch size: 85, loss[discriminator_loss=2.668, discriminator_real_loss=1.365, discriminator_fake_loss=1.303, generator_loss=30.4, generator_mel_loss=22.26, generator_kl_loss=1.952, generator_dur_loss=1.53, generator_adv_loss=1.873, generator_feat_match_loss=2.779, over 85.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.379, discriminator_fake_loss=1.338, generator_loss=29.98, generator_mel_loss=22.03, generator_kl_loss=1.903, generator_dur_loss=1.526, generator_adv_loss=1.925, generator_feat_match_loss=2.593, over 2298.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:31:04,471 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 20:31:12,977 INFO [train.py:534] (3/4) Epoch 184, validation: discriminator_loss=2.673, discriminator_real_loss=1.282, discriminator_fake_loss=1.391, generator_loss=31.15, generator_mel_loss=23.06, generator_kl_loss=1.91, generator_dur_loss=1.511, generator_adv_loss=1.797, generator_feat_match_loss=2.872, over 100.00 samples.
2024-02-22 20:31:12,978 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 20:31:47,188 INFO [train.py:845] (3/4) Start epoch 185
2024-02-22 20:35:15,921 INFO [train.py:845] (3/4) Start epoch 186
2024-02-22 20:35:54,661 INFO [train.py:471] (3/4) Epoch 186, batch 5, global_batch_idx: 6850, batch size: 50, loss[discriminator_loss=2.764, discriminator_real_loss=1.627, discriminator_fake_loss=1.137, generator_loss=29.34, generator_mel_loss=21.83, generator_kl_loss=1.826, generator_dur_loss=1.519, generator_adv_loss=1.758, generator_feat_match_loss=2.408, over 50.00 samples.], tot_loss[discriminator_loss=2.739, discriminator_real_loss=1.41, discriminator_fake_loss=1.329, generator_loss=29.74, generator_mel_loss=21.95, generator_kl_loss=1.855, generator_dur_loss=1.533, generator_adv_loss=1.887, generator_feat_match_loss=2.513, over 367.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:38:40,510 INFO [train.py:845] (3/4) Start epoch 187
2024-02-22 20:40:33,498 INFO [train.py:471] (3/4) Epoch 187, batch 18, global_batch_idx: 6900, batch size: 60, loss[discriminator_loss=2.715, discriminator_real_loss=1.334, discriminator_fake_loss=1.382, generator_loss=30.35, generator_mel_loss=22.14, generator_kl_loss=1.894, generator_dur_loss=1.535, generator_adv_loss=2.252, generator_feat_match_loss=2.527, over 60.00 samples.], tot_loss[discriminator_loss=2.737, discriminator_real_loss=1.412, discriminator_fake_loss=1.325, generator_loss=30.23, generator_mel_loss=22.2, generator_kl_loss=1.899, generator_dur_loss=1.526, generator_adv_loss=1.946, generator_feat_match_loss=2.659, over 1439.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:42:12,411 INFO [train.py:845] (3/4) Start epoch 188
2024-02-22 20:45:05,602 INFO [train.py:471] (3/4) Epoch 188, batch 31, global_batch_idx: 6950, batch size: 82, loss[discriminator_loss=2.732, discriminator_real_loss=1.473, discriminator_fake_loss=1.26, generator_loss=29.87, generator_mel_loss=22.01, generator_kl_loss=1.917, generator_dur_loss=1.547, generator_adv_loss=1.842, generator_feat_match_loss=2.551, over 82.00 samples.], tot_loss[discriminator_loss=2.707, discriminator_real_loss=1.382, discriminator_fake_loss=1.325, generator_loss=30.17, generator_mel_loss=22.17, generator_kl_loss=1.917, generator_dur_loss=1.525, generator_adv_loss=1.916, generator_feat_match_loss=2.64, over 2277.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:45:37,087 INFO [train.py:845] (3/4) Start epoch 189
2024-02-22 20:49:06,510 INFO [train.py:845] (3/4) Start epoch 190
2024-02-22 20:50:06,248 INFO [train.py:471] (3/4) Epoch 190, batch 7, global_batch_idx: 7000, batch size: 69, loss[discriminator_loss=2.652, discriminator_real_loss=1.325, discriminator_fake_loss=1.326, generator_loss=29.71, generator_mel_loss=21.71, generator_kl_loss=1.879, generator_dur_loss=1.524, generator_adv_loss=1.89, generator_feat_match_loss=2.711, over 69.00 samples.], tot_loss[discriminator_loss=2.705, discriminator_real_loss=1.383, discriminator_fake_loss=1.321, generator_loss=30.26, generator_mel_loss=22.22, generator_kl_loss=1.933, generator_dur_loss=1.527, generator_adv_loss=1.947, generator_feat_match_loss=2.632, over 614.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:50:06,250 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 20:50:14,853 INFO [train.py:534] (3/4) Epoch 190, validation: discriminator_loss=2.657, discriminator_real_loss=1.299, discriminator_fake_loss=1.358, generator_loss=31.16, generator_mel_loss=23, generator_kl_loss=2.042, generator_dur_loss=1.524, generator_adv_loss=1.778, generator_feat_match_loss=2.815, over 100.00 samples.
2024-02-22 20:50:14,854 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 20:52:48,889 INFO [train.py:845] (3/4) Start epoch 191
2024-02-22 20:54:46,337 INFO [train.py:471] (3/4) Epoch 191, batch 20, global_batch_idx: 7050, batch size: 126, loss[discriminator_loss=2.664, discriminator_real_loss=1.354, discriminator_fake_loss=1.31, generator_loss=30.68, generator_mel_loss=22.63, generator_kl_loss=1.962, generator_dur_loss=1.512, generator_adv_loss=1.85, generator_feat_match_loss=2.721, over 126.00 samples.], tot_loss[discriminator_loss=2.709, discriminator_real_loss=1.381, discriminator_fake_loss=1.328, generator_loss=30.15, generator_mel_loss=22.14, generator_kl_loss=1.9, generator_dur_loss=1.526, generator_adv_loss=1.917, generator_feat_match_loss=2.668, over 1641.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:56:14,549 INFO [train.py:845] (3/4) Start epoch 192
2024-02-22 20:59:26,504 INFO [train.py:471] (3/4) Epoch 192, batch 33, global_batch_idx: 7100, batch size: 55, loss[discriminator_loss=2.723, discriminator_real_loss=1.361, discriminator_fake_loss=1.361, generator_loss=29.68, generator_mel_loss=21.97, generator_kl_loss=1.963, generator_dur_loss=1.526, generator_adv_loss=1.771, generator_feat_match_loss=2.445, over 55.00 samples.], tot_loss[discriminator_loss=2.71, discriminator_real_loss=1.377, discriminator_fake_loss=1.333, generator_loss=29.88, generator_mel_loss=21.92, generator_kl_loss=1.91, generator_dur_loss=1.525, generator_adv_loss=1.911, generator_feat_match_loss=2.618, over 2478.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 20:59:42,952 INFO [train.py:845] (3/4) Start epoch 193
2024-02-22 21:03:02,656 INFO [train.py:845] (3/4) Start epoch 194
2024-02-22 21:03:57,654 INFO [train.py:471] (3/4) Epoch 194, batch 9, global_batch_idx: 7150, batch size: 53, loss[discriminator_loss=2.717, discriminator_real_loss=1.561, discriminator_fake_loss=1.156, generator_loss=29.04, generator_mel_loss=21.27, generator_kl_loss=1.912, generator_dur_loss=1.528, generator_adv_loss=1.666, generator_feat_match_loss=2.67, over 53.00 samples.], tot_loss[discriminator_loss=2.689, discriminator_real_loss=1.371, discriminator_fake_loss=1.319, generator_loss=29.92, generator_mel_loss=21.94, generator_kl_loss=1.878, generator_dur_loss=1.537, generator_adv_loss=1.89, generator_feat_match_loss=2.678, over 567.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:06:24,719 INFO [train.py:845] (3/4) Start epoch 195
2024-02-22 21:08:23,955 INFO [train.py:471] (3/4) Epoch 195, batch 22, global_batch_idx: 7200, batch size: 64, loss[discriminator_loss=2.711, discriminator_real_loss=1.487, discriminator_fake_loss=1.223, generator_loss=30.06, generator_mel_loss=21.95, generator_kl_loss=1.918, generator_dur_loss=1.517, generator_adv_loss=1.918, generator_feat_match_loss=2.756, over 64.00 samples.], tot_loss[discriminator_loss=2.727, discriminator_real_loss=1.385, discriminator_fake_loss=1.342, generator_loss=30, generator_mel_loss=21.97, generator_kl_loss=1.904, generator_dur_loss=1.529, generator_adv_loss=1.933, generator_feat_match_loss=2.659, over 1589.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:08:23,956 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 21:08:31,777 INFO [train.py:534] (3/4) Epoch 195, validation: discriminator_loss=2.647, discriminator_real_loss=1.321, discriminator_fake_loss=1.326, generator_loss=31.38, generator_mel_loss=23.32, generator_kl_loss=1.954, generator_dur_loss=1.517, generator_adv_loss=1.801, generator_feat_match_loss=2.783, over 100.00 samples.
2024-02-22 21:08:31,778 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 21:09:54,109 INFO [train.py:845] (3/4) Start epoch 196
2024-02-22 21:13:15,875 INFO [train.py:471] (3/4) Epoch 196, batch 35, global_batch_idx: 7250, batch size: 60, loss[discriminator_loss=2.719, discriminator_real_loss=1.267, discriminator_fake_loss=1.452, generator_loss=30.72, generator_mel_loss=22.63, generator_kl_loss=1.898, generator_dur_loss=1.539, generator_adv_loss=1.926, generator_feat_match_loss=2.719, over 60.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.385, discriminator_fake_loss=1.332, generator_loss=30.07, generator_mel_loss=22.08, generator_kl_loss=1.902, generator_dur_loss=1.526, generator_adv_loss=1.924, generator_feat_match_loss=2.643, over 2665.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:13:20,879 INFO [train.py:845] (3/4) Start epoch 197
2024-02-22 21:16:49,133 INFO [train.py:845] (3/4) Start epoch 198
2024-02-22 21:18:02,223 INFO [train.py:471] (3/4) Epoch 198, batch 11, global_batch_idx: 7300, batch size: 76, loss[discriminator_loss=2.707, discriminator_real_loss=1.211, discriminator_fake_loss=1.495, generator_loss=30.74, generator_mel_loss=22.49, generator_kl_loss=1.909, generator_dur_loss=1.518, generator_adv_loss=2.012, generator_feat_match_loss=2.812, over 76.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.357, discriminator_fake_loss=1.345, generator_loss=30.06, generator_mel_loss=21.97, generator_kl_loss=1.92, generator_dur_loss=1.523, generator_adv_loss=1.951, generator_feat_match_loss=2.697, over 848.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:20:14,193 INFO [train.py:845] (3/4) Start epoch 199
2024-02-22 21:22:39,864 INFO [train.py:471] (3/4) Epoch 199, batch 24, global_batch_idx: 7350, batch size: 79, loss[discriminator_loss=2.721, discriminator_real_loss=1.273, discriminator_fake_loss=1.447, generator_loss=30.86, generator_mel_loss=22.57, generator_kl_loss=1.958, generator_dur_loss=1.555, generator_adv_loss=1.977, generator_feat_match_loss=2.793, over 79.00 samples.], tot_loss[discriminator_loss=2.726, discriminator_real_loss=1.393, discriminator_fake_loss=1.333, generator_loss=30.04, generator_mel_loss=22.01, generator_kl_loss=1.924, generator_dur_loss=1.526, generator_adv_loss=1.924, generator_feat_match_loss=2.655, over 1824.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:23:45,336 INFO [train.py:845] (3/4) Start epoch 200
2024-02-22 21:27:14,715 INFO [train.py:845] (3/4) Start epoch 201
2024-02-22 21:27:29,795 INFO [train.py:471] (3/4) Epoch 201, batch 0, global_batch_idx: 7400, batch size: 54, loss[discriminator_loss=2.717, discriminator_real_loss=1.366, discriminator_fake_loss=1.351, generator_loss=29.33, generator_mel_loss=21.41, generator_kl_loss=1.967, generator_dur_loss=1.535, generator_adv_loss=1.957, generator_feat_match_loss=2.457, over 54.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.366, discriminator_fake_loss=1.351, generator_loss=29.33, generator_mel_loss=21.41, generator_kl_loss=1.967, generator_dur_loss=1.535, generator_adv_loss=1.957, generator_feat_match_loss=2.457, over 54.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:27:29,796 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 21:27:38,810 INFO [train.py:534] (3/4) Epoch 201, validation: discriminator_loss=2.61, discriminator_real_loss=1.297, discriminator_fake_loss=1.313, generator_loss=30.93, generator_mel_loss=22.73, generator_kl_loss=2.003, generator_dur_loss=1.521, generator_adv_loss=1.826, generator_feat_match_loss=2.848, over 100.00 samples.
2024-02-22 21:27:38,811 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 21:30:58,156 INFO [train.py:845] (3/4) Start epoch 202
2024-02-22 21:32:23,990 INFO [train.py:471] (3/4) Epoch 202, batch 13, global_batch_idx: 7450, batch size: 49, loss[discriminator_loss=2.754, discriminator_real_loss=1.459, discriminator_fake_loss=1.295, generator_loss=29.46, generator_mel_loss=21.77, generator_kl_loss=1.81, generator_dur_loss=1.522, generator_adv_loss=1.939, generator_feat_match_loss=2.414, over 49.00 samples.], tot_loss[discriminator_loss=2.695, discriminator_real_loss=1.367, discriminator_fake_loss=1.328, generator_loss=29.99, generator_mel_loss=21.92, generator_kl_loss=1.912, generator_dur_loss=1.526, generator_adv_loss=1.945, generator_feat_match_loss=2.686, over 880.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:34:29,360 INFO [train.py:845] (3/4) Start epoch 203
2024-02-22 21:36:57,121 INFO [train.py:471] (3/4) Epoch 203, batch 26, global_batch_idx: 7500, batch size: 56, loss[discriminator_loss=2.689, discriminator_real_loss=1.441, discriminator_fake_loss=1.248, generator_loss=30.07, generator_mel_loss=22.12, generator_kl_loss=1.938, generator_dur_loss=1.537, generator_adv_loss=1.785, generator_feat_match_loss=2.684, over 56.00 samples.], tot_loss[discriminator_loss=2.746, discriminator_real_loss=1.402, discriminator_fake_loss=1.344, generator_loss=29.94, generator_mel_loss=21.93, generator_kl_loss=1.902, generator_dur_loss=1.521, generator_adv_loss=1.925, generator_feat_match_loss=2.655, over 2010.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:37:57,029 INFO [train.py:845] (3/4) Start epoch 204
2024-02-22 21:41:31,553 INFO [train.py:845] (3/4) Start epoch 205
2024-02-22 21:41:55,792 INFO [train.py:471] (3/4) Epoch 205, batch 2, global_batch_idx: 7550, batch size: 101, loss[discriminator_loss=2.695, discriminator_real_loss=1.449, discriminator_fake_loss=1.246, generator_loss=30.29, generator_mel_loss=22.17, generator_kl_loss=1.999, generator_dur_loss=1.511, generator_adv_loss=1.883, generator_feat_match_loss=2.723, over 101.00 samples.], tot_loss[discriminator_loss=2.726, discriminator_real_loss=1.385, discriminator_fake_loss=1.341, generator_loss=29.68, generator_mel_loss=21.79, generator_kl_loss=1.917, generator_dur_loss=1.518, generator_adv_loss=1.89, generator_feat_match_loss=2.564, over 217.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:44:54,709 INFO [train.py:845] (3/4) Start epoch 206
2024-02-22 21:46:27,684 INFO [train.py:471] (3/4) Epoch 206, batch 15, global_batch_idx: 7600, batch size: 110, loss[discriminator_loss=2.711, discriminator_real_loss=1.496, discriminator_fake_loss=1.215, generator_loss=30.07, generator_mel_loss=22.23, generator_kl_loss=2.011, generator_dur_loss=1.516, generator_adv_loss=1.655, generator_feat_match_loss=2.66, over 110.00 samples.], tot_loss[discriminator_loss=2.727, discriminator_real_loss=1.385, discriminator_fake_loss=1.342, generator_loss=30.15, generator_mel_loss=22.11, generator_kl_loss=1.901, generator_dur_loss=1.522, generator_adv_loss=1.906, generator_feat_match_loss=2.71, over 1226.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 256.0
2024-02-22 21:46:27,685 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 21:46:36,592 INFO [train.py:534] (3/4) Epoch 206, validation: discriminator_loss=2.762, discriminator_real_loss=1.162, discriminator_fake_loss=1.6, generator_loss=30.51, generator_mel_loss=22.59, generator_kl_loss=1.973, generator_dur_loss=1.512, generator_adv_loss=1.587, generator_feat_match_loss=2.845, over 100.00 samples.
2024-02-22 21:46:36,593 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 21:48:34,000 INFO [train.py:845] (3/4) Start epoch 207
2024-02-22 21:51:21,119 INFO [train.py:471] (3/4) Epoch 207, batch 28, global_batch_idx: 7650, batch size: 126, loss[discriminator_loss=2.719, discriminator_real_loss=1.385, discriminator_fake_loss=1.335, generator_loss=30.08, generator_mel_loss=21.89, generator_kl_loss=1.984, generator_dur_loss=1.497, generator_adv_loss=1.938, generator_feat_match_loss=2.773, over 126.00 samples.], tot_loss[discriminator_loss=2.734, discriminator_real_loss=1.384, discriminator_fake_loss=1.349, generator_loss=30.07, generator_mel_loss=21.94, generator_kl_loss=1.906, generator_dur_loss=1.521, generator_adv_loss=1.976, generator_feat_match_loss=2.726, over 2235.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 21:52:02,699 INFO [train.py:845] (3/4) Start epoch 208
2024-02-22 21:55:27,981 INFO [train.py:845] (3/4) Start epoch 209
2024-02-22 21:56:00,318 INFO [train.py:471] (3/4) Epoch 209, batch 4, global_batch_idx: 7700, batch size: 65, loss[discriminator_loss=2.766, discriminator_real_loss=1.144, discriminator_fake_loss=1.622, generator_loss=30.14, generator_mel_loss=21.77, generator_kl_loss=1.887, generator_dur_loss=1.553, generator_adv_loss=2.203, generator_feat_match_loss=2.729, over 65.00 samples.], tot_loss[discriminator_loss=2.682, discriminator_real_loss=1.347, discriminator_fake_loss=1.335, generator_loss=29.74, generator_mel_loss=21.63, generator_kl_loss=1.874, generator_dur_loss=1.539, generator_adv_loss=1.962, generator_feat_match_loss=2.739, over 321.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 21:58:57,932 INFO [train.py:845] (3/4) Start epoch 210
2024-02-22 22:00:47,520 INFO [train.py:471] (3/4) Epoch 210, batch 17, global_batch_idx: 7750, batch size: 65, loss[discriminator_loss=2.707, discriminator_real_loss=1.413, discriminator_fake_loss=1.295, generator_loss=29.67, generator_mel_loss=21.66, generator_kl_loss=1.835, generator_dur_loss=1.528, generator_adv_loss=1.837, generator_feat_match_loss=2.812, over 65.00 samples.], tot_loss[discriminator_loss=2.711, discriminator_real_loss=1.378, discriminator_fake_loss=1.333, generator_loss=29.85, generator_mel_loss=21.79, generator_kl_loss=1.917, generator_dur_loss=1.519, generator_adv_loss=1.925, generator_feat_match_loss=2.704, over 1308.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:02:33,481 INFO [train.py:845] (3/4) Start epoch 211
2024-02-22 22:05:39,890 INFO [train.py:471] (3/4) Epoch 211, batch 30, global_batch_idx: 7800, batch size: 50, loss[discriminator_loss=2.703, discriminator_real_loss=1.323, discriminator_fake_loss=1.381, generator_loss=29.61, generator_mel_loss=21.6, generator_kl_loss=1.843, generator_dur_loss=1.512, generator_adv_loss=2.059, generator_feat_match_loss=2.59, over 50.00 samples.], tot_loss[discriminator_loss=2.718, discriminator_real_loss=1.391, discriminator_fake_loss=1.327, generator_loss=29.99, generator_mel_loss=21.93, generator_kl_loss=1.911, generator_dur_loss=1.518, generator_adv_loss=1.934, generator_feat_match_loss=2.691, over 2199.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:05:39,892 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 22:05:48,287 INFO [train.py:534] (3/4) Epoch 211, validation: discriminator_loss=2.691, discriminator_real_loss=1.395, discriminator_fake_loss=1.297, generator_loss=30.81, generator_mel_loss=22.59, generator_kl_loss=1.987, generator_dur_loss=1.519, generator_adv_loss=1.979, generator_feat_match_loss=2.739, over 100.00 samples.
2024-02-22 22:05:48,288 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 22:06:15,370 INFO [train.py:845] (3/4) Start epoch 212
2024-02-22 22:09:34,293 INFO [train.py:845] (3/4) Start epoch 213
2024-02-22 22:10:23,298 INFO [train.py:471] (3/4) Epoch 213, batch 6, global_batch_idx: 7850, batch size: 76, loss[discriminator_loss=2.676, discriminator_real_loss=1.438, discriminator_fake_loss=1.238, generator_loss=29.68, generator_mel_loss=21.67, generator_kl_loss=1.874, generator_dur_loss=1.547, generator_adv_loss=1.857, generator_feat_match_loss=2.732, over 76.00 samples.], tot_loss[discriminator_loss=2.678, discriminator_real_loss=1.362, discriminator_fake_loss=1.317, generator_loss=29.97, generator_mel_loss=21.92, generator_kl_loss=1.892, generator_dur_loss=1.536, generator_adv_loss=1.912, generator_feat_match_loss=2.707, over 488.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:12:59,305 INFO [train.py:845] (3/4) Start epoch 214
2024-02-22 22:14:52,785 INFO [train.py:471] (3/4) Epoch 214, batch 19, global_batch_idx: 7900, batch size: 154, loss[discriminator_loss=2.738, discriminator_real_loss=1.339, discriminator_fake_loss=1.4, generator_loss=30.45, generator_mel_loss=22.23, generator_kl_loss=1.943, generator_dur_loss=1.503, generator_adv_loss=2.002, generator_feat_match_loss=2.775, over 154.00 samples.], tot_loss[discriminator_loss=2.713, discriminator_real_loss=1.374, discriminator_fake_loss=1.339, generator_loss=29.94, generator_mel_loss=21.86, generator_kl_loss=1.898, generator_dur_loss=1.523, generator_adv_loss=1.946, generator_feat_match_loss=2.709, over 1423.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:16:30,098 INFO [train.py:845] (3/4) Start epoch 215
2024-02-22 22:19:36,048 INFO [train.py:471] (3/4) Epoch 215, batch 32, global_batch_idx: 7950, batch size: 101, loss[discriminator_loss=2.723, discriminator_real_loss=1.25, discriminator_fake_loss=1.474, generator_loss=30.83, generator_mel_loss=22.26, generator_kl_loss=1.933, generator_dur_loss=1.51, generator_adv_loss=2.311, generator_feat_match_loss=2.809, over 101.00 samples.], tot_loss[discriminator_loss=2.721, discriminator_real_loss=1.375, discriminator_fake_loss=1.345, generator_loss=29.94, generator_mel_loss=21.88, generator_kl_loss=1.91, generator_dur_loss=1.522, generator_adv_loss=1.926, generator_feat_match_loss=2.699, over 2406.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:19:59,410 INFO [train.py:845] (3/4) Start epoch 216
2024-02-22 22:23:30,101 INFO [train.py:845] (3/4) Start epoch 217
2024-02-22 22:24:32,868 INFO [train.py:471] (3/4) Epoch 217, batch 8, global_batch_idx: 8000, batch size: 69, loss[discriminator_loss=2.688, discriminator_real_loss=1.411, discriminator_fake_loss=1.275, generator_loss=29.65, generator_mel_loss=21.63, generator_kl_loss=1.905, generator_dur_loss=1.522, generator_adv_loss=1.889, generator_feat_match_loss=2.705, over 69.00 samples.], tot_loss[discriminator_loss=2.7, discriminator_real_loss=1.36, discriminator_fake_loss=1.34, generator_loss=30.22, generator_mel_loss=22.07, generator_kl_loss=1.921, generator_dur_loss=1.508, generator_adv_loss=1.922, generator_feat_match_loss=2.798, over 854.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 128.0
2024-02-22 22:24:32,870 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 22:24:40,503 INFO [train.py:534] (3/4) Epoch 217, validation: discriminator_loss=2.635, discriminator_real_loss=1.275, discriminator_fake_loss=1.36, generator_loss=30.71, generator_mel_loss=22.59, generator_kl_loss=1.966, generator_dur_loss=1.511, generator_adv_loss=1.781, generator_feat_match_loss=2.86, over 100.00 samples.
2024-02-22 22:24:40,503 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 22:27:06,517 INFO [train.py:845] (3/4) Start epoch 218
2024-02-22 22:29:07,952 INFO [train.py:471] (3/4) Epoch 218, batch 21, global_batch_idx: 8050, batch size: 85, loss[discriminator_loss=2.654, discriminator_real_loss=1.357, discriminator_fake_loss=1.297, generator_loss=30.19, generator_mel_loss=21.95, generator_kl_loss=1.908, generator_dur_loss=1.528, generator_adv_loss=2.029, generator_feat_match_loss=2.781, over 85.00 samples.], tot_loss[discriminator_loss=2.716, discriminator_real_loss=1.369, discriminator_fake_loss=1.347, generator_loss=29.8, generator_mel_loss=21.78, generator_kl_loss=1.895, generator_dur_loss=1.522, generator_adv_loss=1.909, generator_feat_match_loss=2.695, over 1544.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 64.0
2024-02-22 22:30:26,895 INFO [train.py:845] (3/4) Start epoch 219
2024-02-22 22:33:44,190 INFO [train.py:471] (3/4) Epoch 219, batch 34, global_batch_idx: 8100, batch size: 69, loss[discriminator_loss=2.598, discriminator_real_loss=1.271, discriminator_fake_loss=1.325, generator_loss=30.11, generator_mel_loss=21.85, generator_kl_loss=1.872, generator_dur_loss=1.512, generator_adv_loss=1.863, generator_feat_match_loss=3.01, over 69.00 samples.], tot_loss[discriminator_loss=2.713, discriminator_real_loss=1.377, discriminator_fake_loss=1.336, generator_loss=30.03, generator_mel_loss=21.8, generator_kl_loss=1.902, generator_dur_loss=1.522, generator_adv_loss=1.996, generator_feat_match_loss=2.812, over 2412.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 32.0
2024-02-22 22:33:52,704 INFO [train.py:845] (3/4) Start epoch 220
2024-02-22 22:37:17,943 INFO [train.py:845] (3/4) Start epoch 221
2024-02-22 22:38:29,870 INFO [train.py:471] (3/4) Epoch 221, batch 10, global_batch_idx: 8150, batch size: 55, loss[discriminator_loss=3.172, discriminator_real_loss=1.401, discriminator_fake_loss=1.771, generator_loss=30.61, generator_mel_loss=21.69, generator_kl_loss=1.866, generator_dur_loss=1.56, generator_adv_loss=2.953, generator_feat_match_loss=2.535, over 55.00 samples.], tot_loss[discriminator_loss=2.717, discriminator_real_loss=1.341, discriminator_fake_loss=1.376, generator_loss=31.32, generator_mel_loss=22.33, generator_kl_loss=1.887, generator_dur_loss=1.513, generator_adv_loss=2.323, generator_feat_match_loss=3.267, over 906.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2024-02-22 22:40:45,412 INFO [train.py:845] (3/4) Start epoch 222
2024-02-22 22:43:06,510 INFO [train.py:471] (3/4) Epoch 222, batch 23, global_batch_idx: 8200, batch size: 52, loss[discriminator_loss=2.727, discriminator_real_loss=1.29, discriminator_fake_loss=1.436, generator_loss=30.38, generator_mel_loss=21.52, generator_kl_loss=1.852, generator_dur_loss=1.531, generator_adv_loss=2.242, generator_feat_match_loss=3.234, over 52.00 samples.], tot_loss[discriminator_loss=2.663, discriminator_real_loss=1.348, discriminator_fake_loss=1.314, generator_loss=30.11, generator_mel_loss=21.79, generator_kl_loss=1.887, generator_dur_loss=1.519, generator_adv_loss=2.033, generator_feat_match_loss=2.878, over 1696.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2024-02-22 22:43:06,511 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 22:43:15,099 INFO [train.py:534] (3/4) Epoch 222, validation: discriminator_loss=2.618, discriminator_real_loss=1.228, discriminator_fake_loss=1.39, generator_loss=31.04, generator_mel_loss=22.83, generator_kl_loss=2.01, generator_dur_loss=1.511, generator_adv_loss=1.779, generator_feat_match_loss=2.91, over 100.00 samples.
2024-02-22 22:43:15,101 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 22:44:28,036 INFO [train.py:845] (3/4) Start epoch 223
2024-02-22 22:47:57,600 INFO [train.py:471] (3/4) Epoch 223, batch 36, global_batch_idx: 8250, batch size: 126, loss[discriminator_loss=2.678, discriminator_real_loss=1.391, discriminator_fake_loss=1.287, generator_loss=30.42, generator_mel_loss=21.91, generator_kl_loss=1.939, generator_dur_loss=1.503, generator_adv_loss=2.137, generator_feat_match_loss=2.922, over 126.00 samples.], tot_loss[discriminator_loss=2.694, discriminator_real_loss=1.364, discriminator_fake_loss=1.33, generator_loss=30.08, generator_mel_loss=21.75, generator_kl_loss=1.9, generator_dur_loss=1.52, generator_adv_loss=2.061, generator_feat_match_loss=2.857, over 2519.00 samples.], cur_lr_g: 1.95e-04, cur_lr_d: 1.95e-04, grad_scale: 16.0
2024-02-22 22:47:58,067 INFO [train.py:845] (3/4) Start epoch 224
2024-02-22 22:51:32,039 INFO [train.py:845] (3/4) Start epoch 225
2024-02-22 22:52:48,987 INFO [train.py:471] (3/4) Epoch 225, batch 12, global_batch_idx: 8300, batch size: 126, loss[discriminator_loss=2.75, discriminator_real_loss=1.511, discriminator_fake_loss=1.24, generator_loss=30.1, generator_mel_loss=21.83, generator_kl_loss=1.889, generator_dur_loss=1.487, generator_adv_loss=1.955, generator_feat_match_loss=2.936, over 126.00 samples.], tot_loss[discriminator_loss=2.657, discriminator_real_loss=1.34, discriminator_fake_loss=1.317, generator_loss=30.21, generator_mel_loss=21.8, generator_kl_loss=1.901, generator_dur_loss=1.514, generator_adv_loss=2.037, generator_feat_match_loss=2.959, over 943.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 22:55:00,818 INFO [train.py:845] (3/4) Start epoch 226
2024-02-22 22:57:32,482 INFO [train.py:471] (3/4) Epoch 226, batch 25, global_batch_idx: 8350, batch size: 60, loss[discriminator_loss=2.617, discriminator_real_loss=1.299, discriminator_fake_loss=1.317, generator_loss=31.05, generator_mel_loss=22.07, generator_kl_loss=1.816, generator_dur_loss=1.53, generator_adv_loss=2.053, generator_feat_match_loss=3.586, over 60.00 samples.], tot_loss[discriminator_loss=2.756, discriminator_real_loss=1.443, discriminator_fake_loss=1.313, generator_loss=30.41, generator_mel_loss=21.73, generator_kl_loss=1.91, generator_dur_loss=1.517, generator_adv_loss=2.153, generator_feat_match_loss=3.104, over 1888.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 22:58:32,109 INFO [train.py:845] (3/4) Start epoch 227
2024-02-22 23:02:04,966 INFO [train.py:845] (3/4) Start epoch 228
2024-02-22 23:02:23,889 INFO [train.py:471] (3/4) Epoch 228, batch 1, global_batch_idx: 8400, batch size: 52, loss[discriminator_loss=2.703, discriminator_real_loss=1.321, discriminator_fake_loss=1.383, generator_loss=30.58, generator_mel_loss=21.78, generator_kl_loss=1.82, generator_dur_loss=1.551, generator_adv_loss=2.162, generator_feat_match_loss=3.268, over 52.00 samples.], tot_loss[discriminator_loss=2.644, discriminator_real_loss=1.347, discriminator_fake_loss=1.298, generator_loss=30.9, generator_mel_loss=21.73, generator_kl_loss=1.902, generator_dur_loss=1.534, generator_adv_loss=2.293, generator_feat_match_loss=3.45, over 108.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2024-02-22 23:02:23,890 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 23:02:32,682 INFO [train.py:534] (3/4) Epoch 228, validation: discriminator_loss=2.758, discriminator_real_loss=1.256, discriminator_fake_loss=1.502, generator_loss=30.72, generator_mel_loss=22.26, generator_kl_loss=2.084, generator_dur_loss=1.504, generator_adv_loss=1.804, generator_feat_match_loss=3.063, over 100.00 samples.
2024-02-22 23:02:32,683 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28057MB
2024-02-22 23:05:44,919 INFO [train.py:845] (3/4) Start epoch 229
2024-02-22 23:07:01,729 INFO [train.py:471] (3/4) Epoch 229, batch 14, global_batch_idx: 8450, batch size: 67, loss[discriminator_loss=2.57, discriminator_real_loss=1.405, discriminator_fake_loss=1.166, generator_loss=30.17, generator_mel_loss=21.45, generator_kl_loss=1.973, generator_dur_loss=1.545, generator_adv_loss=2.236, generator_feat_match_loss=2.973, over 67.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.323, discriminator_fake_loss=1.3, generator_loss=29.95, generator_mel_loss=21.61, generator_kl_loss=1.884, generator_dur_loss=1.528, generator_adv_loss=2.016, generator_feat_match_loss=2.911, over 1008.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:09:08,804 INFO [train.py:845] (3/4) Start epoch 230
2024-02-22 23:11:47,837 INFO [train.py:471] (3/4) Epoch 230, batch 27, global_batch_idx: 8500, batch size: 71, loss[discriminator_loss=2.803, discriminator_real_loss=1.236, discriminator_fake_loss=1.566, generator_loss=29.41, generator_mel_loss=21.61, generator_kl_loss=1.793, generator_dur_loss=1.502, generator_adv_loss=1.897, generator_feat_match_loss=2.613, over 71.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.321, discriminator_fake_loss=1.299, generator_loss=30.56, generator_mel_loss=21.61, generator_kl_loss=1.885, generator_dur_loss=1.513, generator_adv_loss=2.224, generator_feat_match_loss=3.324, over 2163.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:12:39,931 INFO [train.py:845] (3/4) Start epoch 231
2024-02-22 23:16:09,073 INFO [train.py:845] (3/4) Start epoch 232
2024-02-22 23:16:38,297 INFO [train.py:471] (3/4) Epoch 232, batch 3, global_batch_idx: 8550, batch size: 52, loss[discriminator_loss=2.662, discriminator_real_loss=1.389, discriminator_fake_loss=1.273, generator_loss=29.84, generator_mel_loss=20.92, generator_kl_loss=1.885, generator_dur_loss=1.512, generator_adv_loss=2.291, generator_feat_match_loss=3.234, over 52.00 samples.], tot_loss[discriminator_loss=2.558, discriminator_real_loss=1.273, discriminator_fake_loss=1.285, generator_loss=30.2, generator_mel_loss=21.32, generator_kl_loss=1.877, generator_dur_loss=1.513, generator_adv_loss=2.257, generator_feat_match_loss=3.235, over 237.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:19:41,380 INFO [train.py:845] (3/4) Start epoch 233
2024-02-22 23:21:25,315 INFO [train.py:471] (3/4) Epoch 233, batch 16, global_batch_idx: 8600, batch size: 49, loss[discriminator_loss=2.441, discriminator_real_loss=1.267, discriminator_fake_loss=1.176, generator_loss=30.61, generator_mel_loss=21.14, generator_kl_loss=1.838, generator_dur_loss=1.521, generator_adv_loss=2.383, generator_feat_match_loss=3.723, over 49.00 samples.], tot_loss[discriminator_loss=2.51, discriminator_real_loss=1.273, discriminator_fake_loss=1.238, generator_loss=30.84, generator_mel_loss=21.45, generator_kl_loss=1.866, generator_dur_loss=1.517, generator_adv_loss=2.312, generator_feat_match_loss=3.698, over 1299.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:21:25,316 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 23:21:33,521 INFO [train.py:534] (3/4) Epoch 233, validation: discriminator_loss=2.418, discriminator_real_loss=1.299, discriminator_fake_loss=1.119, generator_loss=32.3, generator_mel_loss=22.62, generator_kl_loss=1.961, generator_dur_loss=1.516, generator_adv_loss=2.35, generator_feat_match_loss=3.86, over 100.00 samples.
2024-02-22 23:21:33,522 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28061MB
2024-02-22 23:23:14,694 INFO [train.py:845] (3/4) Start epoch 234
2024-02-22 23:26:07,397 INFO [train.py:471] (3/4) Epoch 234, batch 29, global_batch_idx: 8650, batch size: 58, loss[discriminator_loss=2.621, discriminator_real_loss=1.16, discriminator_fake_loss=1.462, generator_loss=29.95, generator_mel_loss=20.73, generator_kl_loss=1.889, generator_dur_loss=1.499, generator_adv_loss=2.416, generator_feat_match_loss=3.408, over 58.00 samples.], tot_loss[discriminator_loss=2.579, discriminator_real_loss=1.29, discriminator_fake_loss=1.289, generator_loss=30.73, generator_mel_loss=21.37, generator_kl_loss=1.885, generator_dur_loss=1.514, generator_adv_loss=2.355, generator_feat_match_loss=3.598, over 2221.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:26:45,439 INFO [train.py:845] (3/4) Start epoch 235
2024-02-22 23:30:11,239 INFO [train.py:845] (3/4) Start epoch 236
2024-02-22 23:30:54,452 INFO [train.py:471] (3/4) Epoch 236, batch 5, global_batch_idx: 8700, batch size: 67, loss[discriminator_loss=2.48, discriminator_real_loss=1.207, discriminator_fake_loss=1.273, generator_loss=31.08, generator_mel_loss=21.45, generator_kl_loss=1.961, generator_dur_loss=1.527, generator_adv_loss=2.434, generator_feat_match_loss=3.713, over 67.00 samples.], tot_loss[discriminator_loss=2.427, discriminator_real_loss=1.206, discriminator_fake_loss=1.221, generator_loss=30.78, generator_mel_loss=21.33, generator_kl_loss=1.941, generator_dur_loss=1.521, generator_adv_loss=2.33, generator_feat_match_loss=3.66, over 421.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:33:38,450 INFO [train.py:845] (3/4) Start epoch 237
2024-02-22 23:35:24,796 INFO [train.py:471] (3/4) Epoch 237, batch 18, global_batch_idx: 8750, batch size: 85, loss[discriminator_loss=2.738, discriminator_real_loss=1.215, discriminator_fake_loss=1.524, generator_loss=29.77, generator_mel_loss=21.49, generator_kl_loss=1.87, generator_dur_loss=1.513, generator_adv_loss=1.935, generator_feat_match_loss=2.967, over 85.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.309, discriminator_fake_loss=1.303, generator_loss=30.01, generator_mel_loss=21.43, generator_kl_loss=1.938, generator_dur_loss=1.513, generator_adv_loss=2.067, generator_feat_match_loss=3.068, over 1431.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0
2024-02-22 23:37:06,660 INFO [train.py:845] (3/4) Start epoch 238
2024-02-22 23:40:05,517 INFO [train.py:471] (3/4) Epoch 238, batch 31, global_batch_idx: 8800, batch size: 69, loss[discriminator_loss=2.602, discriminator_real_loss=1.386, discriminator_fake_loss=1.217, generator_loss=29.68, generator_mel_loss=21.11, generator_kl_loss=1.855, generator_dur_loss=1.513, generator_adv_loss=2.164, generator_feat_match_loss=3.039, over 69.00 samples.], tot_loss[discriminator_loss=2.498, discriminator_real_loss=1.264, discriminator_fake_loss=1.235, generator_loss=30.85, generator_mel_loss=21.35, generator_kl_loss=1.92, generator_dur_loss=1.516, generator_adv_loss=2.369, generator_feat_match_loss=3.696, over 2435.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0
2024-02-22 23:40:05,519 INFO [train.py:525] (3/4) Computing validation loss
2024-02-22 23:40:14,246 INFO [train.py:534] (3/4) Epoch 238, validation: discriminator_loss=2.484, discriminator_real_loss=1.216, discriminator_fake_loss=1.268, generator_loss=31.88, generator_mel_loss=22.77, generator_kl_loss=1.983, generator_dur_loss=1.509, generator_adv_loss=2.21, generator_feat_match_loss=3.41, over 100.00 samples.
2024-02-22 23:40:14,246 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28061MB 2024-02-22 23:40:46,761 INFO [train.py:845] (3/4) Start epoch 239 2024-02-22 23:44:17,070 INFO [train.py:845] (3/4) Start epoch 240 2024-02-22 23:45:12,703 INFO [train.py:471] (3/4) Epoch 240, batch 7, global_batch_idx: 8850, batch size: 71, loss[discriminator_loss=2.648, discriminator_real_loss=1.377, discriminator_fake_loss=1.271, generator_loss=29.67, generator_mel_loss=21.12, generator_kl_loss=1.893, generator_dur_loss=1.497, generator_adv_loss=2.232, generator_feat_match_loss=2.928, over 71.00 samples.], tot_loss[discriminator_loss=2.652, discriminator_real_loss=1.358, discriminator_fake_loss=1.294, generator_loss=30.02, generator_mel_loss=21.38, generator_kl_loss=1.918, generator_dur_loss=1.506, generator_adv_loss=2.117, generator_feat_match_loss=3.101, over 707.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-22 23:47:45,233 INFO [train.py:845] (3/4) Start epoch 241 2024-02-22 23:49:51,092 INFO [train.py:471] (3/4) Epoch 241, batch 20, global_batch_idx: 8900, batch size: 95, loss[discriminator_loss=4.062, discriminator_real_loss=2.02, discriminator_fake_loss=2.045, generator_loss=30.12, generator_mel_loss=21.12, generator_kl_loss=1.97, generator_dur_loss=1.485, generator_adv_loss=2.451, generator_feat_match_loss=3.09, over 95.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.335, discriminator_fake_loss=1.292, generator_loss=31.12, generator_mel_loss=21.32, generator_kl_loss=1.922, generator_dur_loss=1.511, generator_adv_loss=2.467, generator_feat_match_loss=3.905, over 1635.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-22 23:51:10,283 INFO [train.py:845] (3/4) Start epoch 242 2024-02-22 23:54:22,197 INFO [train.py:471] (3/4) Epoch 242, batch 33, global_batch_idx: 8950, batch size: 67, loss[discriminator_loss=3.176, discriminator_real_loss=1.415, discriminator_fake_loss=1.762, 
generator_loss=29.27, generator_mel_loss=21.81, generator_kl_loss=1.952, generator_dur_loss=1.521, generator_adv_loss=1.709, generator_feat_match_loss=2.273, over 67.00 samples.], tot_loss[discriminator_loss=2.55, discriminator_real_loss=1.283, discriminator_fake_loss=1.268, generator_loss=30.35, generator_mel_loss=21.1, generator_kl_loss=1.93, generator_dur_loss=1.51, generator_adv_loss=2.296, generator_feat_match_loss=3.522, over 2745.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-22 23:54:37,411 INFO [train.py:845] (3/4) Start epoch 243 2024-02-22 23:58:04,645 INFO [train.py:845] (3/4) Start epoch 244 2024-02-22 23:59:10,110 INFO [train.py:471] (3/4) Epoch 244, batch 9, global_batch_idx: 9000, batch size: 101, loss[discriminator_loss=2.285, discriminator_real_loss=1.063, discriminator_fake_loss=1.221, generator_loss=31.42, generator_mel_loss=21.18, generator_kl_loss=1.884, generator_dur_loss=1.51, generator_adv_loss=2.625, generator_feat_match_loss=4.219, over 101.00 samples.], tot_loss[discriminator_loss=2.339, discriminator_real_loss=1.166, discriminator_fake_loss=1.173, generator_loss=31.02, generator_mel_loss=21.04, generator_kl_loss=1.907, generator_dur_loss=1.51, generator_adv_loss=2.399, generator_feat_match_loss=4.17, over 807.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-22 23:59:10,112 INFO [train.py:525] (3/4) Computing validation loss 2024-02-22 23:59:18,558 INFO [train.py:534] (3/4) Epoch 244, validation: discriminator_loss=2.271, discriminator_real_loss=1.164, discriminator_fake_loss=1.107, generator_loss=32.59, generator_mel_loss=22.33, generator_kl_loss=2.112, generator_dur_loss=1.507, generator_adv_loss=2.449, generator_feat_match_loss=4.194, over 100.00 samples. 
2024-02-22 23:59:18,559 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28061MB 2024-02-23 00:01:37,739 INFO [train.py:845] (3/4) Start epoch 245 2024-02-23 00:03:52,457 INFO [train.py:471] (3/4) Epoch 245, batch 22, global_batch_idx: 9050, batch size: 52, loss[discriminator_loss=2.422, discriminator_real_loss=1.315, discriminator_fake_loss=1.107, generator_loss=30.42, generator_mel_loss=20.52, generator_kl_loss=1.85, generator_dur_loss=1.519, generator_adv_loss=2.445, generator_feat_match_loss=4.078, over 52.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.275, discriminator_fake_loss=1.23, generator_loss=30.53, generator_mel_loss=21.03, generator_kl_loss=1.891, generator_dur_loss=1.51, generator_adv_loss=2.38, generator_feat_match_loss=3.72, over 1805.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 00:05:09,542 INFO [train.py:845] (3/4) Start epoch 246 2024-02-23 00:08:33,868 INFO [train.py:471] (3/4) Epoch 246, batch 35, global_batch_idx: 9100, batch size: 81, loss[discriminator_loss=2.625, discriminator_real_loss=1.216, discriminator_fake_loss=1.41, generator_loss=28.76, generator_mel_loss=20.1, generator_kl_loss=1.896, generator_dur_loss=1.5, generator_adv_loss=2.086, generator_feat_match_loss=3.178, over 81.00 samples.], tot_loss[discriminator_loss=2.466, discriminator_real_loss=1.244, discriminator_fake_loss=1.222, generator_loss=30.63, generator_mel_loss=21.02, generator_kl_loss=1.933, generator_dur_loss=1.508, generator_adv_loss=2.364, generator_feat_match_loss=3.804, over 2728.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 00:08:42,289 INFO [train.py:845] (3/4) Start epoch 247 2024-02-23 00:12:09,211 INFO [train.py:845] (3/4) Start epoch 248 2024-02-23 00:13:26,738 INFO [train.py:471] (3/4) Epoch 248, batch 11, global_batch_idx: 9150, batch size: 126, loss[discriminator_loss=2.568, discriminator_real_loss=1.414, discriminator_fake_loss=1.154, 
generator_loss=29.4, generator_mel_loss=20.63, generator_kl_loss=1.966, generator_dur_loss=1.532, generator_adv_loss=2.141, generator_feat_match_loss=3.129, over 126.00 samples.], tot_loss[discriminator_loss=2.528, discriminator_real_loss=1.271, discriminator_fake_loss=1.258, generator_loss=30.13, generator_mel_loss=21.09, generator_kl_loss=1.935, generator_dur_loss=1.517, generator_adv_loss=2.228, generator_feat_match_loss=3.358, over 1027.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 00:15:42,667 INFO [train.py:845] (3/4) Start epoch 249 2024-02-23 00:18:04,581 INFO [train.py:471] (3/4) Epoch 249, batch 24, global_batch_idx: 9200, batch size: 153, loss[discriminator_loss=2.379, discriminator_real_loss=1.042, discriminator_fake_loss=1.336, generator_loss=31.65, generator_mel_loss=21.25, generator_kl_loss=1.964, generator_dur_loss=1.468, generator_adv_loss=2.539, generator_feat_match_loss=4.43, over 153.00 samples.], tot_loss[discriminator_loss=2.387, discriminator_real_loss=1.196, discriminator_fake_loss=1.191, generator_loss=31.04, generator_mel_loss=21.04, generator_kl_loss=1.932, generator_dur_loss=1.509, generator_adv_loss=2.429, generator_feat_match_loss=4.131, over 2006.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 32.0 2024-02-23 00:18:04,583 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 00:18:12,560 INFO [train.py:534] (3/4) Epoch 249, validation: discriminator_loss=2.333, discriminator_real_loss=1.069, discriminator_fake_loss=1.264, generator_loss=32.12, generator_mel_loss=21.78, generator_kl_loss=2.062, generator_dur_loss=1.513, generator_adv_loss=2.397, generator_feat_match_loss=4.361, over 100.00 samples. 
2024-02-23 00:18:12,561 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 00:19:14,186 INFO [train.py:845] (3/4) Start epoch 250 2024-02-23 00:22:39,209 INFO [train.py:845] (3/4) Start epoch 251 2024-02-23 00:22:52,289 INFO [train.py:471] (3/4) Epoch 251, batch 0, global_batch_idx: 9250, batch size: 90, loss[discriminator_loss=2.344, discriminator_real_loss=1.296, discriminator_fake_loss=1.048, generator_loss=30.59, generator_mel_loss=20.74, generator_kl_loss=1.857, generator_dur_loss=1.501, generator_adv_loss=2.293, generator_feat_match_loss=4.199, over 90.00 samples.], tot_loss[discriminator_loss=2.344, discriminator_real_loss=1.296, discriminator_fake_loss=1.048, generator_loss=30.59, generator_mel_loss=20.74, generator_kl_loss=1.857, generator_dur_loss=1.501, generator_adv_loss=2.293, generator_feat_match_loss=4.199, over 90.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 00:26:07,674 INFO [train.py:845] (3/4) Start epoch 252 2024-02-23 00:27:28,732 INFO [train.py:471] (3/4) Epoch 252, batch 13, global_batch_idx: 9300, batch size: 65, loss[discriminator_loss=2.305, discriminator_real_loss=1.132, discriminator_fake_loss=1.172, generator_loss=31.34, generator_mel_loss=21.36, generator_kl_loss=1.898, generator_dur_loss=1.495, generator_adv_loss=2.426, generator_feat_match_loss=4.168, over 65.00 samples.], tot_loss[discriminator_loss=2.387, discriminator_real_loss=1.183, discriminator_fake_loss=1.204, generator_loss=30.84, generator_mel_loss=21.14, generator_kl_loss=1.945, generator_dur_loss=1.519, generator_adv_loss=2.368, generator_feat_match_loss=3.874, over 1035.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:29:35,998 INFO [train.py:845] (3/4) Start epoch 253 2024-02-23 00:32:22,517 INFO [train.py:471] (3/4) Epoch 253, batch 26, global_batch_idx: 9350, batch size: 110, loss[discriminator_loss=2.541, discriminator_real_loss=1.258, discriminator_fake_loss=1.283, 
generator_loss=29.51, generator_mel_loss=20.69, generator_kl_loss=1.936, generator_dur_loss=1.483, generator_adv_loss=2.238, generator_feat_match_loss=3.164, over 110.00 samples.], tot_loss[discriminator_loss=2.543, discriminator_real_loss=1.274, discriminator_fake_loss=1.268, generator_loss=30.45, generator_mel_loss=20.83, generator_kl_loss=1.914, generator_dur_loss=1.509, generator_adv_loss=2.375, generator_feat_match_loss=3.824, over 2038.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:33:08,623 INFO [train.py:845] (3/4) Start epoch 254 2024-02-23 00:36:37,224 INFO [train.py:845] (3/4) Start epoch 255 2024-02-23 00:37:03,663 INFO [train.py:471] (3/4) Epoch 255, batch 2, global_batch_idx: 9400, batch size: 73, loss[discriminator_loss=2.453, discriminator_real_loss=1.202, discriminator_fake_loss=1.251, generator_loss=29.97, generator_mel_loss=20.78, generator_kl_loss=1.815, generator_dur_loss=1.496, generator_adv_loss=2.299, generator_feat_match_loss=3.582, over 73.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.176, discriminator_fake_loss=1.26, generator_loss=30.06, generator_mel_loss=20.75, generator_kl_loss=1.886, generator_dur_loss=1.494, generator_adv_loss=2.254, generator_feat_match_loss=3.669, over 293.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:37:03,664 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 00:37:13,414 INFO [train.py:534] (3/4) Epoch 255, validation: discriminator_loss=2.391, discriminator_real_loss=1.144, discriminator_fake_loss=1.247, generator_loss=31.07, generator_mel_loss=21.7, generator_kl_loss=1.965, generator_dur_loss=1.499, generator_adv_loss=2.208, generator_feat_match_loss=3.697, over 100.00 samples. 
2024-02-23 00:37:13,414 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 00:40:19,086 INFO [train.py:845] (3/4) Start epoch 256 2024-02-23 00:41:46,703 INFO [train.py:471] (3/4) Epoch 256, batch 15, global_batch_idx: 9450, batch size: 56, loss[discriminator_loss=2.438, discriminator_real_loss=1.159, discriminator_fake_loss=1.277, generator_loss=30.04, generator_mel_loss=20.52, generator_kl_loss=1.875, generator_dur_loss=1.521, generator_adv_loss=2.305, generator_feat_match_loss=3.818, over 56.00 samples.], tot_loss[discriminator_loss=2.457, discriminator_real_loss=1.25, discriminator_fake_loss=1.207, generator_loss=30.28, generator_mel_loss=20.64, generator_kl_loss=1.931, generator_dur_loss=1.519, generator_adv_loss=2.319, generator_feat_match_loss=3.87, over 1006.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:43:48,407 INFO [train.py:845] (3/4) Start epoch 257 2024-02-23 00:46:39,076 INFO [train.py:471] (3/4) Epoch 257, batch 28, global_batch_idx: 9500, batch size: 110, loss[discriminator_loss=2.506, discriminator_real_loss=1.237, discriminator_fake_loss=1.269, generator_loss=30.06, generator_mel_loss=20.71, generator_kl_loss=1.866, generator_dur_loss=1.486, generator_adv_loss=2.342, generator_feat_match_loss=3.656, over 110.00 samples.], tot_loss[discriminator_loss=2.438, discriminator_real_loss=1.233, discriminator_fake_loss=1.205, generator_loss=30.66, generator_mel_loss=20.72, generator_kl_loss=1.918, generator_dur_loss=1.505, generator_adv_loss=2.436, generator_feat_match_loss=4.08, over 2300.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:47:18,612 INFO [train.py:845] (3/4) Start epoch 258 2024-02-23 00:50:45,120 INFO [train.py:845] (3/4) Start epoch 259 2024-02-23 00:51:21,404 INFO [train.py:471] (3/4) Epoch 259, batch 4, global_batch_idx: 9550, batch size: 53, loss[discriminator_loss=2.242, discriminator_real_loss=1.062, discriminator_fake_loss=1.179, 
generator_loss=30.57, generator_mel_loss=20.36, generator_kl_loss=1.919, generator_dur_loss=1.531, generator_adv_loss=2.551, generator_feat_match_loss=4.207, over 53.00 samples.], tot_loss[discriminator_loss=2.315, discriminator_real_loss=1.183, discriminator_fake_loss=1.131, generator_loss=30.97, generator_mel_loss=20.61, generator_kl_loss=1.963, generator_dur_loss=1.517, generator_adv_loss=2.493, generator_feat_match_loss=4.383, over 355.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 8.0 2024-02-23 00:54:12,825 INFO [train.py:845] (3/4) Start epoch 260 2024-02-23 00:55:55,727 INFO [train.py:471] (3/4) Epoch 260, batch 17, global_batch_idx: 9600, batch size: 53, loss[discriminator_loss=2.211, discriminator_real_loss=1.09, discriminator_fake_loss=1.12, generator_loss=31.78, generator_mel_loss=20.56, generator_kl_loss=1.933, generator_dur_loss=1.506, generator_adv_loss=2.799, generator_feat_match_loss=4.988, over 53.00 samples.], tot_loss[discriminator_loss=2.419, discriminator_real_loss=1.232, discriminator_fake_loss=1.187, generator_loss=31.13, generator_mel_loss=20.82, generator_kl_loss=1.945, generator_dur_loss=1.506, generator_adv_loss=2.496, generator_feat_match_loss=4.36, over 1468.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 00:55:55,729 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 00:56:04,514 INFO [train.py:534] (3/4) Epoch 260, validation: discriminator_loss=2.182, discriminator_real_loss=1.128, discriminator_fake_loss=1.054, generator_loss=32.83, generator_mel_loss=21.43, generator_kl_loss=1.963, generator_dur_loss=1.514, generator_adv_loss=2.71, generator_feat_match_loss=5.211, over 100.00 samples. 
2024-02-23 00:56:04,515 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 00:57:49,259 INFO [train.py:845] (3/4) Start epoch 261 2024-02-23 01:00:44,501 INFO [train.py:471] (3/4) Epoch 261, batch 30, global_batch_idx: 9650, batch size: 73, loss[discriminator_loss=2.441, discriminator_real_loss=1.28, discriminator_fake_loss=1.161, generator_loss=29.49, generator_mel_loss=20.02, generator_kl_loss=1.966, generator_dur_loss=1.507, generator_adv_loss=2.355, generator_feat_match_loss=3.645, over 73.00 samples.], tot_loss[discriminator_loss=2.505, discriminator_real_loss=1.289, discriminator_fake_loss=1.216, generator_loss=30.22, generator_mel_loss=20.47, generator_kl_loss=1.917, generator_dur_loss=1.516, generator_adv_loss=2.433, generator_feat_match_loss=3.882, over 2150.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 01:01:16,101 INFO [train.py:845] (3/4) Start epoch 262 2024-02-23 01:04:38,947 INFO [train.py:845] (3/4) Start epoch 263 2024-02-23 01:05:23,212 INFO [train.py:471] (3/4) Epoch 263, batch 6, global_batch_idx: 9700, batch size: 53, loss[discriminator_loss=2.566, discriminator_real_loss=1.45, discriminator_fake_loss=1.116, generator_loss=29.76, generator_mel_loss=19.88, generator_kl_loss=2.076, generator_dur_loss=1.515, generator_adv_loss=2.516, generator_feat_match_loss=3.773, over 53.00 samples.], tot_loss[discriminator_loss=2.746, discriminator_real_loss=1.503, discriminator_fake_loss=1.243, generator_loss=29.89, generator_mel_loss=20.37, generator_kl_loss=1.931, generator_dur_loss=1.512, generator_adv_loss=2.481, generator_feat_match_loss=3.591, over 426.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 01:08:08,634 INFO [train.py:845] (3/4) Start epoch 264 2024-02-23 01:09:56,048 INFO [train.py:471] (3/4) Epoch 264, batch 19, global_batch_idx: 9750, batch size: 126, loss[discriminator_loss=2.453, discriminator_real_loss=1.211, discriminator_fake_loss=1.242, 
generator_loss=30.94, generator_mel_loss=20.86, generator_kl_loss=1.953, generator_dur_loss=1.495, generator_adv_loss=2.395, generator_feat_match_loss=4.242, over 126.00 samples.], tot_loss[discriminator_loss=2.437, discriminator_real_loss=1.24, discriminator_fake_loss=1.197, generator_loss=30.68, generator_mel_loss=20.62, generator_kl_loss=1.906, generator_dur_loss=1.51, generator_adv_loss=2.471, generator_feat_match_loss=4.178, over 1395.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 01:11:24,741 INFO [train.py:845] (3/4) Start epoch 265 2024-02-23 01:14:36,570 INFO [train.py:471] (3/4) Epoch 265, batch 32, global_batch_idx: 9800, batch size: 69, loss[discriminator_loss=2.5, discriminator_real_loss=1.274, discriminator_fake_loss=1.226, generator_loss=29.71, generator_mel_loss=20.42, generator_kl_loss=1.913, generator_dur_loss=1.531, generator_adv_loss=2.43, generator_feat_match_loss=3.42, over 69.00 samples.], tot_loss[discriminator_loss=2.47, discriminator_real_loss=1.262, discriminator_fake_loss=1.207, generator_loss=30.39, generator_mel_loss=20.6, generator_kl_loss=1.935, generator_dur_loss=1.509, generator_adv_loss=2.417, generator_feat_match_loss=3.926, over 2615.00 samples.], cur_lr_g: 1.94e-04, cur_lr_d: 1.94e-04, grad_scale: 16.0 2024-02-23 01:14:36,572 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 01:14:45,484 INFO [train.py:534] (3/4) Epoch 265, validation: discriminator_loss=2.553, discriminator_real_loss=1.249, discriminator_fake_loss=1.304, generator_loss=30.94, generator_mel_loss=21.28, generator_kl_loss=2.113, generator_dur_loss=1.509, generator_adv_loss=2.288, generator_feat_match_loss=3.747, over 100.00 samples. 
2024-02-23 01:14:45,485 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 01:15:02,880 INFO [train.py:845] (3/4) Start epoch 266 2024-02-23 01:18:31,973 INFO [train.py:845] (3/4) Start epoch 267 2024-02-23 01:19:32,451 INFO [train.py:471] (3/4) Epoch 267, batch 8, global_batch_idx: 9850, batch size: 64, loss[discriminator_loss=2.387, discriminator_real_loss=1.204, discriminator_fake_loss=1.184, generator_loss=29.95, generator_mel_loss=20.16, generator_kl_loss=1.857, generator_dur_loss=1.51, generator_adv_loss=2.473, generator_feat_match_loss=3.949, over 64.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.244, discriminator_fake_loss=1.191, generator_loss=30.32, generator_mel_loss=20.49, generator_kl_loss=1.91, generator_dur_loss=1.511, generator_adv_loss=2.387, generator_feat_match_loss=4.029, over 639.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:22:03,303 INFO [train.py:845] (3/4) Start epoch 268 2024-02-23 01:24:16,400 INFO [train.py:471] (3/4) Epoch 268, batch 21, global_batch_idx: 9900, batch size: 76, loss[discriminator_loss=2.395, discriminator_real_loss=1.168, discriminator_fake_loss=1.226, generator_loss=29.81, generator_mel_loss=20.12, generator_kl_loss=1.901, generator_dur_loss=1.527, generator_adv_loss=2.416, generator_feat_match_loss=3.848, over 76.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.212, discriminator_fake_loss=1.217, generator_loss=29.8, generator_mel_loss=20.4, generator_kl_loss=1.912, generator_dur_loss=1.511, generator_adv_loss=2.329, generator_feat_match_loss=3.644, over 1478.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:25:35,844 INFO [train.py:845] (3/4) Start epoch 269 2024-02-23 01:28:54,502 INFO [train.py:471] (3/4) Epoch 269, batch 34, global_batch_idx: 9950, batch size: 65, loss[discriminator_loss=2.326, discriminator_real_loss=1.233, discriminator_fake_loss=1.093, 
generator_loss=31.17, generator_mel_loss=20.75, generator_kl_loss=2.154, generator_dur_loss=1.502, generator_adv_loss=2.332, generator_feat_match_loss=4.434, over 65.00 samples.], tot_loss[discriminator_loss=2.462, discriminator_real_loss=1.247, discriminator_fake_loss=1.214, generator_loss=30.27, generator_mel_loss=20.6, generator_kl_loss=1.93, generator_dur_loss=1.514, generator_adv_loss=2.375, generator_feat_match_loss=3.849, over 2626.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:29:05,117 INFO [train.py:845] (3/4) Start epoch 270 2024-02-23 01:32:28,109 INFO [train.py:845] (3/4) Start epoch 271 2024-02-23 01:33:34,307 INFO [train.py:471] (3/4) Epoch 271, batch 10, global_batch_idx: 10000, batch size: 54, loss[discriminator_loss=2.469, discriminator_real_loss=1.423, discriminator_fake_loss=1.046, generator_loss=31.07, generator_mel_loss=20.76, generator_kl_loss=1.843, generator_dur_loss=1.518, generator_adv_loss=2.416, generator_feat_match_loss=4.531, over 54.00 samples.], tot_loss[discriminator_loss=2.4, discriminator_real_loss=1.197, discriminator_fake_loss=1.203, generator_loss=30.42, generator_mel_loss=20.59, generator_kl_loss=1.922, generator_dur_loss=1.518, generator_adv_loss=2.391, generator_feat_match_loss=4.002, over 704.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0 2024-02-23 01:33:34,307 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 01:33:42,148 INFO [train.py:534] (3/4) Epoch 271, validation: discriminator_loss=2.414, discriminator_real_loss=1.118, discriminator_fake_loss=1.296, generator_loss=31.98, generator_mel_loss=21.33, generator_kl_loss=2.093, generator_dur_loss=1.499, generator_adv_loss=2.373, generator_feat_match_loss=4.684, over 100.00 samples. 
2024-02-23 01:33:42,149 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 01:36:03,503 INFO [train.py:845] (3/4) Start epoch 272 2024-02-23 01:38:17,565 INFO [train.py:471] (3/4) Epoch 272, batch 23, global_batch_idx: 10050, batch size: 126, loss[discriminator_loss=2.355, discriminator_real_loss=1.15, discriminator_fake_loss=1.205, generator_loss=30.92, generator_mel_loss=20.84, generator_kl_loss=1.9, generator_dur_loss=1.497, generator_adv_loss=2.432, generator_feat_match_loss=4.254, over 126.00 samples.], tot_loss[discriminator_loss=2.401, discriminator_real_loss=1.213, discriminator_fake_loss=1.188, generator_loss=30.53, generator_mel_loss=20.48, generator_kl_loss=1.923, generator_dur_loss=1.504, generator_adv_loss=2.472, generator_feat_match_loss=4.149, over 1829.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:39:30,950 INFO [train.py:845] (3/4) Start epoch 273 2024-02-23 01:43:04,225 INFO [train.py:471] (3/4) Epoch 273, batch 36, global_batch_idx: 10100, batch size: 56, loss[discriminator_loss=2.541, discriminator_real_loss=1.223, discriminator_fake_loss=1.318, generator_loss=30.15, generator_mel_loss=21.16, generator_kl_loss=1.953, generator_dur_loss=1.516, generator_adv_loss=2.172, generator_feat_match_loss=3.346, over 56.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.358, discriminator_fake_loss=1.203, generator_loss=30.29, generator_mel_loss=20.56, generator_kl_loss=1.93, generator_dur_loss=1.509, generator_adv_loss=2.464, generator_feat_match_loss=3.822, over 2760.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:43:04,661 INFO [train.py:845] (3/4) Start epoch 274 2024-02-23 01:46:33,197 INFO [train.py:845] (3/4) Start epoch 275 2024-02-23 01:47:52,108 INFO [train.py:471] (3/4) Epoch 275, batch 12, global_batch_idx: 10150, batch size: 69, loss[discriminator_loss=2.293, discriminator_real_loss=1.117, 
discriminator_fake_loss=1.176, generator_loss=30.55, generator_mel_loss=20.43, generator_kl_loss=1.881, generator_dur_loss=1.5, generator_adv_loss=2.473, generator_feat_match_loss=4.266, over 69.00 samples.], tot_loss[discriminator_loss=2.4, discriminator_real_loss=1.201, discriminator_fake_loss=1.2, generator_loss=30.21, generator_mel_loss=20.47, generator_kl_loss=1.934, generator_dur_loss=1.507, generator_adv_loss=2.398, generator_feat_match_loss=3.9, over 937.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:50:03,734 INFO [train.py:845] (3/4) Start epoch 276 2024-02-23 01:52:23,959 INFO [train.py:471] (3/4) Epoch 276, batch 25, global_batch_idx: 10200, batch size: 67, loss[discriminator_loss=2.537, discriminator_real_loss=1.344, discriminator_fake_loss=1.193, generator_loss=30.63, generator_mel_loss=20.46, generator_kl_loss=1.917, generator_dur_loss=1.527, generator_adv_loss=2.621, generator_feat_match_loss=4.102, over 67.00 samples.], tot_loss[discriminator_loss=2.379, discriminator_real_loss=1.193, discriminator_fake_loss=1.186, generator_loss=30.53, generator_mel_loss=20.57, generator_kl_loss=1.916, generator_dur_loss=1.508, generator_adv_loss=2.418, generator_feat_match_loss=4.117, over 1755.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 01:52:23,961 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 01:52:32,157 INFO [train.py:534] (3/4) Epoch 276, validation: discriminator_loss=2.562, discriminator_real_loss=1.431, discriminator_fake_loss=1.131, generator_loss=31.64, generator_mel_loss=21.27, generator_kl_loss=2.058, generator_dur_loss=1.501, generator_adv_loss=2.504, generator_feat_match_loss=4.301, over 100.00 samples. 
2024-02-23 01:52:32,158 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 01:53:38,848 INFO [train.py:845] (3/4) Start epoch 277 2024-02-23 01:57:09,424 INFO [train.py:845] (3/4) Start epoch 278 2024-02-23 01:57:30,336 INFO [train.py:471] (3/4) Epoch 278, batch 1, global_batch_idx: 10250, batch size: 90, loss[discriminator_loss=2.701, discriminator_real_loss=1.401, discriminator_fake_loss=1.3, generator_loss=29.54, generator_mel_loss=20.28, generator_kl_loss=1.898, generator_dur_loss=1.497, generator_adv_loss=2.26, generator_feat_match_loss=3.6, over 90.00 samples.], tot_loss[discriminator_loss=2.724, discriminator_real_loss=1.611, discriminator_fake_loss=1.113, generator_loss=29.91, generator_mel_loss=20.54, generator_kl_loss=1.905, generator_dur_loss=1.5, generator_adv_loss=2.293, generator_feat_match_loss=3.671, over 180.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:00:36,668 INFO [train.py:845] (3/4) Start epoch 279 2024-02-23 02:02:06,799 INFO [train.py:471] (3/4) Epoch 279, batch 14, global_batch_idx: 10300, batch size: 55, loss[discriminator_loss=2.404, discriminator_real_loss=1.181, discriminator_fake_loss=1.224, generator_loss=29.48, generator_mel_loss=19.91, generator_kl_loss=1.91, generator_dur_loss=1.526, generator_adv_loss=2.301, generator_feat_match_loss=3.83, over 55.00 samples.], tot_loss[discriminator_loss=2.416, discriminator_real_loss=1.228, discriminator_fake_loss=1.188, generator_loss=30.01, generator_mel_loss=20.28, generator_kl_loss=1.926, generator_dur_loss=1.516, generator_adv_loss=2.398, generator_feat_match_loss=3.888, over 966.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:04:08,308 INFO [train.py:845] (3/4) Start epoch 280 2024-02-23 02:06:47,177 INFO [train.py:471] (3/4) Epoch 280, batch 27, global_batch_idx: 10350, batch size: 54, loss[discriminator_loss=2.496, discriminator_real_loss=1.139, discriminator_fake_loss=1.357, 
generator_loss=30.1, generator_mel_loss=20.24, generator_kl_loss=1.889, generator_dur_loss=1.549, generator_adv_loss=2.254, generator_feat_match_loss=4.164, over 54.00 samples.], tot_loss[discriminator_loss=2.517, discriminator_real_loss=1.305, discriminator_fake_loss=1.212, generator_loss=30.91, generator_mel_loss=20.63, generator_kl_loss=1.93, generator_dur_loss=1.511, generator_adv_loss=2.568, generator_feat_match_loss=4.275, over 1906.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:07:34,466 INFO [train.py:845] (3/4) Start epoch 281 2024-02-23 02:11:05,337 INFO [train.py:845] (3/4) Start epoch 282 2024-02-23 02:11:36,266 INFO [train.py:471] (3/4) Epoch 282, batch 3, global_batch_idx: 10400, batch size: 63, loss[discriminator_loss=2.438, discriminator_real_loss=1.209, discriminator_fake_loss=1.228, generator_loss=30.03, generator_mel_loss=20.72, generator_kl_loss=1.915, generator_dur_loss=1.51, generator_adv_loss=2.316, generator_feat_match_loss=3.574, over 63.00 samples.], tot_loss[discriminator_loss=2.434, discriminator_real_loss=1.221, discriminator_fake_loss=1.213, generator_loss=29.66, generator_mel_loss=20.29, generator_kl_loss=1.896, generator_dur_loss=1.514, generator_adv_loss=2.343, generator_feat_match_loss=3.617, over 268.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0 2024-02-23 02:11:36,267 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 02:11:44,147 INFO [train.py:534] (3/4) Epoch 282, validation: discriminator_loss=2.473, discriminator_real_loss=1.178, discriminator_fake_loss=1.295, generator_loss=30.98, generator_mel_loss=21.46, generator_kl_loss=1.989, generator_dur_loss=1.509, generator_adv_loss=2.29, generator_feat_match_loss=3.737, over 100.00 samples. 
2024-02-23 02:11:44,147 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 02:14:39,422 INFO [train.py:845] (3/4) Start epoch 283 2024-02-23 02:16:09,263 INFO [train.py:471] (3/4) Epoch 283, batch 16, global_batch_idx: 10450, batch size: 63, loss[discriminator_loss=2.27, discriminator_real_loss=1.209, discriminator_fake_loss=1.061, generator_loss=30.03, generator_mel_loss=19.63, generator_kl_loss=1.956, generator_dur_loss=1.486, generator_adv_loss=2.461, generator_feat_match_loss=4.496, over 63.00 samples.], tot_loss[discriminator_loss=2.393, discriminator_real_loss=1.213, discriminator_fake_loss=1.179, generator_loss=30.23, generator_mel_loss=20.32, generator_kl_loss=1.929, generator_dur_loss=1.511, generator_adv_loss=2.42, generator_feat_match_loss=4.049, over 1136.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:18:02,809 INFO [train.py:845] (3/4) Start epoch 284 2024-02-23 02:20:55,888 INFO [train.py:471] (3/4) Epoch 284, batch 29, global_batch_idx: 10500, batch size: 73, loss[discriminator_loss=2.273, discriminator_real_loss=1.203, discriminator_fake_loss=1.069, generator_loss=30.82, generator_mel_loss=20.43, generator_kl_loss=2.014, generator_dur_loss=1.493, generator_adv_loss=2.527, generator_feat_match_loss=4.355, over 73.00 samples.], tot_loss[discriminator_loss=2.378, discriminator_real_loss=1.202, discriminator_fake_loss=1.176, generator_loss=30.49, generator_mel_loss=20.41, generator_kl_loss=1.961, generator_dur_loss=1.508, generator_adv_loss=2.447, generator_feat_match_loss=4.164, over 2209.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:21:38,918 INFO [train.py:845] (3/4) Start epoch 285 2024-02-23 02:25:12,288 INFO [train.py:845] (3/4) Start epoch 286 2024-02-23 02:25:51,326 INFO [train.py:471] (3/4) Epoch 286, batch 5, global_batch_idx: 10550, batch size: 69, loss[discriminator_loss=2.305, discriminator_real_loss=1.109, 
discriminator_fake_loss=1.196, generator_loss=30.24, generator_mel_loss=19.98, generator_kl_loss=1.983, generator_dur_loss=1.512, generator_adv_loss=2.428, generator_feat_match_loss=4.34, over 69.00 samples.], tot_loss[discriminator_loss=2.376, discriminator_real_loss=1.189, discriminator_fake_loss=1.188, generator_loss=30.32, generator_mel_loss=20.23, generator_kl_loss=1.938, generator_dur_loss=1.515, generator_adv_loss=2.42, generator_feat_match_loss=4.225, over 380.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:28:44,333 INFO [train.py:845] (3/4) Start epoch 287 2024-02-23 02:30:36,768 INFO [train.py:471] (3/4) Epoch 287, batch 18, global_batch_idx: 10600, batch size: 71, loss[discriminator_loss=2.295, discriminator_real_loss=1.368, discriminator_fake_loss=0.9272, generator_loss=32.16, generator_mel_loss=20.66, generator_kl_loss=1.926, generator_dur_loss=1.503, generator_adv_loss=2.684, generator_feat_match_loss=5.383, over 71.00 samples.], tot_loss[discriminator_loss=2.404, discriminator_real_loss=1.222, discriminator_fake_loss=1.182, generator_loss=30.52, generator_mel_loss=20.45, generator_kl_loss=1.919, generator_dur_loss=1.508, generator_adv_loss=2.465, generator_feat_match_loss=4.182, over 1250.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:30:36,770 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 02:30:44,619 INFO [train.py:534] (3/4) Epoch 287, validation: discriminator_loss=2.075, discriminator_real_loss=0.9959, discriminator_fake_loss=1.079, generator_loss=33.42, generator_mel_loss=21.64, generator_kl_loss=2.035, generator_dur_loss=1.502, generator_adv_loss=2.61, generator_feat_match_loss=5.626, over 100.00 samples. 
2024-02-23 02:30:44,619 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 02:32:20,542 INFO [train.py:845] (3/4) Start epoch 288 2024-02-23 02:35:18,382 INFO [train.py:471] (3/4) Epoch 288, batch 31, global_batch_idx: 10650, batch size: 126, loss[discriminator_loss=2.336, discriminator_real_loss=1.132, discriminator_fake_loss=1.203, generator_loss=30.46, generator_mel_loss=20.42, generator_kl_loss=1.937, generator_dur_loss=1.498, generator_adv_loss=2.402, generator_feat_match_loss=4.195, over 126.00 samples.], tot_loss[discriminator_loss=2.422, discriminator_real_loss=1.202, discriminator_fake_loss=1.22, generator_loss=29.95, generator_mel_loss=20.36, generator_kl_loss=1.938, generator_dur_loss=1.508, generator_adv_loss=2.342, generator_feat_match_loss=3.804, over 2270.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:35:49,179 INFO [train.py:845] (3/4) Start epoch 289 2024-02-23 02:39:17,954 INFO [train.py:845] (3/4) Start epoch 290 2024-02-23 02:40:14,568 INFO [train.py:471] (3/4) Epoch 290, batch 7, global_batch_idx: 10700, batch size: 90, loss[discriminator_loss=2.406, discriminator_real_loss=1.182, discriminator_fake_loss=1.226, generator_loss=30.18, generator_mel_loss=20.62, generator_kl_loss=1.893, generator_dur_loss=1.496, generator_adv_loss=2.377, generator_feat_match_loss=3.795, over 90.00 samples.], tot_loss[discriminator_loss=2.462, discriminator_real_loss=1.263, discriminator_fake_loss=1.199, generator_loss=30.08, generator_mel_loss=20.38, generator_kl_loss=1.922, generator_dur_loss=1.504, generator_adv_loss=2.379, generator_feat_match_loss=3.89, over 672.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:42:46,364 INFO [train.py:845] (3/4) Start epoch 291 2024-02-23 02:44:48,844 INFO [train.py:471] (3/4) Epoch 291, batch 20, global_batch_idx: 10750, batch size: 126, loss[discriminator_loss=2.398, discriminator_real_loss=1.064, 
discriminator_fake_loss=1.333, generator_loss=30.59, generator_mel_loss=20.43, generator_kl_loss=1.991, generator_dur_loss=1.477, generator_adv_loss=2.361, generator_feat_match_loss=4.332, over 126.00 samples.], tot_loss[discriminator_loss=2.437, discriminator_real_loss=1.241, discriminator_fake_loss=1.195, generator_loss=30.7, generator_mel_loss=20.39, generator_kl_loss=1.928, generator_dur_loss=1.507, generator_adv_loss=2.507, generator_feat_match_loss=4.366, over 1532.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 02:46:13,019 INFO [train.py:845] (3/4) Start epoch 292 2024-02-23 02:49:23,792 INFO [train.py:471] (3/4) Epoch 292, batch 33, global_batch_idx: 10800, batch size: 71, loss[discriminator_loss=2.438, discriminator_real_loss=1.266, discriminator_fake_loss=1.172, generator_loss=30.38, generator_mel_loss=20.76, generator_kl_loss=1.945, generator_dur_loss=1.505, generator_adv_loss=2.369, generator_feat_match_loss=3.799, over 71.00 samples.], tot_loss[discriminator_loss=2.465, discriminator_real_loss=1.258, discriminator_fake_loss=1.206, generator_loss=30.51, generator_mel_loss=20.44, generator_kl_loss=1.933, generator_dur_loss=1.507, generator_adv_loss=2.484, generator_feat_match_loss=4.148, over 2386.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0 2024-02-23 02:49:23,793 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 02:49:31,840 INFO [train.py:534] (3/4) Epoch 292, validation: discriminator_loss=2.433, discriminator_real_loss=1.199, discriminator_fake_loss=1.234, generator_loss=30.61, generator_mel_loss=21.11, generator_kl_loss=2.047, generator_dur_loss=1.493, generator_adv_loss=2.275, generator_feat_match_loss=3.682, over 100.00 samples. 
2024-02-23 02:49:31,841 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 02:49:51,977 INFO [train.py:845] (3/4) Start epoch 293 2024-02-23 02:53:23,737 INFO [train.py:845] (3/4) Start epoch 294 2024-02-23 02:54:24,995 INFO [train.py:471] (3/4) Epoch 294, batch 9, global_batch_idx: 10850, batch size: 60, loss[discriminator_loss=2.744, discriminator_real_loss=1.461, discriminator_fake_loss=1.283, generator_loss=29.86, generator_mel_loss=20.15, generator_kl_loss=1.844, generator_dur_loss=1.484, generator_adv_loss=2.535, generator_feat_match_loss=3.846, over 60.00 samples.], tot_loss[discriminator_loss=2.347, discriminator_real_loss=1.139, discriminator_fake_loss=1.208, generator_loss=30.95, generator_mel_loss=20.51, generator_kl_loss=1.947, generator_dur_loss=1.5, generator_adv_loss=2.546, generator_feat_match_loss=4.447, over 800.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0 2024-02-23 02:56:52,441 INFO [train.py:845] (3/4) Start epoch 295 2024-02-23 02:58:57,693 INFO [train.py:471] (3/4) Epoch 295, batch 22, global_batch_idx: 10900, batch size: 58, loss[discriminator_loss=2.395, discriminator_real_loss=1.264, discriminator_fake_loss=1.131, generator_loss=28.89, generator_mel_loss=19.41, generator_kl_loss=1.97, generator_dur_loss=1.506, generator_adv_loss=2.383, generator_feat_match_loss=3.617, over 58.00 samples.], tot_loss[discriminator_loss=2.509, discriminator_real_loss=1.288, discriminator_fake_loss=1.221, generator_loss=29.48, generator_mel_loss=20.06, generator_kl_loss=1.899, generator_dur_loss=1.513, generator_adv_loss=2.381, generator_feat_match_loss=3.627, over 1539.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:00:16,024 INFO [train.py:845] (3/4) Start epoch 296 2024-02-23 03:03:38,370 INFO [train.py:471] (3/4) Epoch 296, batch 35, global_batch_idx: 10950, batch size: 59, loss[discriminator_loss=2.336, discriminator_real_loss=1.213, discriminator_fake_loss=1.123, 
generator_loss=30.18, generator_mel_loss=19.99, generator_kl_loss=1.88, generator_dur_loss=1.51, generator_adv_loss=2.531, generator_feat_match_loss=4.266, over 59.00 samples.], tot_loss[discriminator_loss=2.47, discriminator_real_loss=1.258, discriminator_fake_loss=1.211, generator_loss=30.17, generator_mel_loss=20.22, generator_kl_loss=1.933, generator_dur_loss=1.503, generator_adv_loss=2.441, generator_feat_match_loss=4.072, over 2772.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:03:44,705 INFO [train.py:845] (3/4) Start epoch 297 2024-02-23 03:07:13,721 INFO [train.py:845] (3/4) Start epoch 298 2024-02-23 03:08:22,447 INFO [train.py:471] (3/4) Epoch 298, batch 11, global_batch_idx: 11000, batch size: 67, loss[discriminator_loss=2.395, discriminator_real_loss=1.105, discriminator_fake_loss=1.289, generator_loss=30.2, generator_mel_loss=20.04, generator_kl_loss=1.854, generator_dur_loss=1.506, generator_adv_loss=2.432, generator_feat_match_loss=4.375, over 67.00 samples.], tot_loss[discriminator_loss=2.38, discriminator_real_loss=1.179, discriminator_fake_loss=1.201, generator_loss=30.29, generator_mel_loss=20.31, generator_kl_loss=1.935, generator_dur_loss=1.506, generator_adv_loss=2.415, generator_feat_match_loss=4.125, over 837.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:08:22,449 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 03:08:30,943 INFO [train.py:534] (3/4) Epoch 298, validation: discriminator_loss=2.331, discriminator_real_loss=1.139, discriminator_fake_loss=1.193, generator_loss=32.38, generator_mel_loss=21.65, generator_kl_loss=2.029, generator_dur_loss=1.503, generator_adv_loss=2.438, generator_feat_match_loss=4.753, over 100.00 samples. 
2024-02-23 03:08:30,944 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 03:10:45,230 INFO [train.py:845] (3/4) Start epoch 299 2024-02-23 03:13:11,233 INFO [train.py:471] (3/4) Epoch 299, batch 24, global_batch_idx: 11050, batch size: 153, loss[discriminator_loss=2.445, discriminator_real_loss=1.324, discriminator_fake_loss=1.12, generator_loss=30.7, generator_mel_loss=20.39, generator_kl_loss=1.842, generator_dur_loss=1.468, generator_adv_loss=2.551, generator_feat_match_loss=4.453, over 153.00 samples.], tot_loss[discriminator_loss=2.404, discriminator_real_loss=1.223, discriminator_fake_loss=1.181, generator_loss=30.31, generator_mel_loss=20.23, generator_kl_loss=1.914, generator_dur_loss=1.5, generator_adv_loss=2.449, generator_feat_match_loss=4.216, over 1833.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:14:17,635 INFO [train.py:845] (3/4) Start epoch 300 2024-02-23 03:17:46,556 INFO [train.py:845] (3/4) Start epoch 301 2024-02-23 03:18:01,309 INFO [train.py:471] (3/4) Epoch 301, batch 0, global_batch_idx: 11100, batch size: 153, loss[discriminator_loss=2.48, discriminator_real_loss=1.258, discriminator_fake_loss=1.222, generator_loss=29.72, generator_mel_loss=20.14, generator_kl_loss=1.959, generator_dur_loss=1.478, generator_adv_loss=2.334, generator_feat_match_loss=3.816, over 153.00 samples.], tot_loss[discriminator_loss=2.48, discriminator_real_loss=1.258, discriminator_fake_loss=1.222, generator_loss=29.72, generator_mel_loss=20.14, generator_kl_loss=1.959, generator_dur_loss=1.478, generator_adv_loss=2.334, generator_feat_match_loss=3.816, over 153.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:21:12,882 INFO [train.py:845] (3/4) Start epoch 302 2024-02-23 03:22:32,050 INFO [train.py:471] (3/4) Epoch 302, batch 13, global_batch_idx: 11150, batch size: 53, loss[discriminator_loss=2.24, discriminator_real_loss=1.104, discriminator_fake_loss=1.137, 
generator_loss=29.53, generator_mel_loss=19.54, generator_kl_loss=1.929, generator_dur_loss=1.524, generator_adv_loss=2.4, generator_feat_match_loss=4.133, over 53.00 samples.], tot_loss[discriminator_loss=2.402, discriminator_real_loss=1.185, discriminator_fake_loss=1.217, generator_loss=29.71, generator_mel_loss=20.06, generator_kl_loss=1.948, generator_dur_loss=1.505, generator_adv_loss=2.342, generator_feat_match_loss=3.855, over 853.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:24:40,858 INFO [train.py:845] (3/4) Start epoch 303 2024-02-23 03:27:15,412 INFO [train.py:471] (3/4) Epoch 303, batch 26, global_batch_idx: 11200, batch size: 64, loss[discriminator_loss=2.623, discriminator_real_loss=1.46, discriminator_fake_loss=1.163, generator_loss=30.49, generator_mel_loss=20.34, generator_kl_loss=1.974, generator_dur_loss=1.489, generator_adv_loss=2.602, generator_feat_match_loss=4.09, over 64.00 samples.], tot_loss[discriminator_loss=2.348, discriminator_real_loss=1.183, discriminator_fake_loss=1.164, generator_loss=30.47, generator_mel_loss=20.27, generator_kl_loss=1.939, generator_dur_loss=1.504, generator_adv_loss=2.473, generator_feat_match_loss=4.287, over 1956.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 32.0 2024-02-23 03:27:15,412 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 03:27:24,545 INFO [train.py:534] (3/4) Epoch 303, validation: discriminator_loss=2.704, discriminator_real_loss=1.501, discriminator_fake_loss=1.202, generator_loss=31.55, generator_mel_loss=21.15, generator_kl_loss=2.052, generator_dur_loss=1.494, generator_adv_loss=2.451, generator_feat_match_loss=4.404, over 100.00 samples. 
2024-02-23 03:27:24,546 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 03:28:16,414 INFO [train.py:845] (3/4) Start epoch 304 2024-02-23 03:31:48,740 INFO [train.py:845] (3/4) Start epoch 305 2024-02-23 03:32:13,397 INFO [train.py:471] (3/4) Epoch 305, batch 2, global_batch_idx: 11250, batch size: 76, loss[discriminator_loss=2.506, discriminator_real_loss=1.217, discriminator_fake_loss=1.289, generator_loss=29.52, generator_mel_loss=20.27, generator_kl_loss=1.938, generator_dur_loss=1.537, generator_adv_loss=2.195, generator_feat_match_loss=3.588, over 76.00 samples.], tot_loss[discriminator_loss=2.479, discriminator_real_loss=1.221, discriminator_fake_loss=1.257, generator_loss=29.75, generator_mel_loss=20.34, generator_kl_loss=1.925, generator_dur_loss=1.514, generator_adv_loss=2.277, generator_feat_match_loss=3.692, over 220.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:35:13,676 INFO [train.py:845] (3/4) Start epoch 306 2024-02-23 03:36:44,489 INFO [train.py:471] (3/4) Epoch 306, batch 15, global_batch_idx: 11300, batch size: 101, loss[discriminator_loss=2.371, discriminator_real_loss=1.147, discriminator_fake_loss=1.224, generator_loss=30.09, generator_mel_loss=19.99, generator_kl_loss=1.948, generator_dur_loss=1.489, generator_adv_loss=2.4, generator_feat_match_loss=4.266, over 101.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.188, discriminator_fake_loss=1.188, generator_loss=30.31, generator_mel_loss=20.33, generator_kl_loss=1.965, generator_dur_loss=1.507, generator_adv_loss=2.424, generator_feat_match_loss=4.082, over 1161.00 samples.], cur_lr_g: 1.93e-04, cur_lr_d: 1.93e-04, grad_scale: 16.0 2024-02-23 03:38:39,727 INFO [train.py:845] (3/4) Start epoch 307 2024-02-23 03:41:21,375 INFO [train.py:471] (3/4) Epoch 307, batch 28, global_batch_idx: 11350, batch size: 64, loss[discriminator_loss=2.4, discriminator_real_loss=1.164, discriminator_fake_loss=1.236, 
generator_loss=30.1, generator_mel_loss=20.02, generator_kl_loss=1.859, generator_dur_loss=1.504, generator_adv_loss=2.547, generator_feat_match_loss=4.168, over 64.00 samples.], tot_loss[discriminator_loss=2.366, discriminator_real_loss=1.193, discriminator_fake_loss=1.173, generator_loss=30.34, generator_mel_loss=20.17, generator_kl_loss=1.936, generator_dur_loss=1.507, generator_adv_loss=2.469, generator_feat_match_loss=4.263, over 1886.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 03:42:07,440 INFO [train.py:845] (3/4) Start epoch 308 2024-02-23 03:45:34,190 INFO [train.py:845] (3/4) Start epoch 309 2024-02-23 03:46:06,316 INFO [train.py:471] (3/4) Epoch 309, batch 4, global_batch_idx: 11400, batch size: 90, loss[discriminator_loss=2.434, discriminator_real_loss=1.397, discriminator_fake_loss=1.036, generator_loss=30.06, generator_mel_loss=20.23, generator_kl_loss=1.869, generator_dur_loss=1.482, generator_adv_loss=2.404, generator_feat_match_loss=4.074, over 90.00 samples.], tot_loss[discriminator_loss=2.374, discriminator_real_loss=1.256, discriminator_fake_loss=1.118, generator_loss=30.14, generator_mel_loss=20.04, generator_kl_loss=1.876, generator_dur_loss=1.502, generator_adv_loss=2.481, generator_feat_match_loss=4.242, over 341.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 03:46:06,317 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 03:46:14,932 INFO [train.py:534] (3/4) Epoch 309, validation: discriminator_loss=2.501, discriminator_real_loss=1.044, discriminator_fake_loss=1.458, generator_loss=30.64, generator_mel_loss=20.79, generator_kl_loss=2.017, generator_dur_loss=1.505, generator_adv_loss=2.224, generator_feat_match_loss=4.102, over 100.00 samples. 
2024-02-23 03:46:14,933 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 03:49:14,594 INFO [train.py:845] (3/4) Start epoch 310 2024-02-23 03:51:06,728 INFO [train.py:471] (3/4) Epoch 310, batch 17, global_batch_idx: 11450, batch size: 126, loss[discriminator_loss=2.844, discriminator_real_loss=1.183, discriminator_fake_loss=1.662, generator_loss=30.13, generator_mel_loss=20.5, generator_kl_loss=2.035, generator_dur_loss=1.457, generator_adv_loss=2.949, generator_feat_match_loss=3.191, over 126.00 samples.], tot_loss[discriminator_loss=2.583, discriminator_real_loss=1.295, discriminator_fake_loss=1.287, generator_loss=29.66, generator_mel_loss=20.36, generator_kl_loss=1.969, generator_dur_loss=1.496, generator_adv_loss=2.291, generator_feat_match_loss=3.549, over 1485.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 03:52:43,134 INFO [train.py:845] (3/4) Start epoch 311 2024-02-23 03:55:32,683 INFO [train.py:471] (3/4) Epoch 311, batch 30, global_batch_idx: 11500, batch size: 61, loss[discriminator_loss=2.395, discriminator_real_loss=1.237, discriminator_fake_loss=1.158, generator_loss=29.28, generator_mel_loss=19.69, generator_kl_loss=1.969, generator_dur_loss=1.502, generator_adv_loss=2.363, generator_feat_match_loss=3.752, over 61.00 samples.], tot_loss[discriminator_loss=2.446, discriminator_real_loss=1.262, discriminator_fake_loss=1.184, generator_loss=29.96, generator_mel_loss=20.16, generator_kl_loss=1.926, generator_dur_loss=1.505, generator_adv_loss=2.414, generator_feat_match_loss=3.955, over 2119.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 03:56:07,548 INFO [train.py:845] (3/4) Start epoch 312 2024-02-23 03:59:33,551 INFO [train.py:845] (3/4) Start epoch 313 2024-02-23 04:00:20,467 INFO [train.py:471] (3/4) Epoch 313, batch 6, global_batch_idx: 11550, batch size: 54, loss[discriminator_loss=2.402, discriminator_real_loss=1.222, 
discriminator_fake_loss=1.182, generator_loss=30.03, generator_mel_loss=20.41, generator_kl_loss=1.905, generator_dur_loss=1.484, generator_adv_loss=2.346, generator_feat_match_loss=3.885, over 54.00 samples.], tot_loss[discriminator_loss=2.437, discriminator_real_loss=1.235, discriminator_fake_loss=1.202, generator_loss=29.69, generator_mel_loss=20.21, generator_kl_loss=1.948, generator_dur_loss=1.504, generator_adv_loss=2.33, generator_feat_match_loss=3.703, over 474.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:03:02,328 INFO [train.py:845] (3/4) Start epoch 314 2024-02-23 04:05:05,247 INFO [train.py:471] (3/4) Epoch 314, batch 19, global_batch_idx: 11600, batch size: 55, loss[discriminator_loss=2.602, discriminator_real_loss=1.146, discriminator_fake_loss=1.455, generator_loss=30.68, generator_mel_loss=20.11, generator_kl_loss=1.906, generator_dur_loss=1.525, generator_adv_loss=2.672, generator_feat_match_loss=4.461, over 55.00 samples.], tot_loss[discriminator_loss=2.396, discriminator_real_loss=1.187, discriminator_fake_loss=1.21, generator_loss=30.63, generator_mel_loss=20.38, generator_kl_loss=1.946, generator_dur_loss=1.501, generator_adv_loss=2.47, generator_feat_match_loss=4.335, over 1484.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 04:05:05,249 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 04:05:13,218 INFO [train.py:534] (3/4) Epoch 314, validation: discriminator_loss=2.443, discriminator_real_loss=1.45, discriminator_fake_loss=0.9932, generator_loss=31.98, generator_mel_loss=21.2, generator_kl_loss=2.121, generator_dur_loss=1.5, generator_adv_loss=2.658, generator_feat_match_loss=4.501, over 100.00 samples. 
2024-02-23 04:05:13,219 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 04:06:39,624 INFO [train.py:845] (3/4) Start epoch 315 2024-02-23 04:09:54,965 INFO [train.py:471] (3/4) Epoch 315, batch 32, global_batch_idx: 11650, batch size: 67, loss[discriminator_loss=2.389, discriminator_real_loss=1.184, discriminator_fake_loss=1.205, generator_loss=30.08, generator_mel_loss=20.17, generator_kl_loss=1.99, generator_dur_loss=1.519, generator_adv_loss=2.43, generator_feat_match_loss=3.98, over 67.00 samples.], tot_loss[discriminator_loss=2.391, discriminator_real_loss=1.197, discriminator_fake_loss=1.194, generator_loss=29.95, generator_mel_loss=20.03, generator_kl_loss=1.942, generator_dur_loss=1.498, generator_adv_loss=2.393, generator_feat_match_loss=4.083, over 2595.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:10:14,204 INFO [train.py:845] (3/4) Start epoch 316 2024-02-23 04:13:46,227 INFO [train.py:845] (3/4) Start epoch 317 2024-02-23 04:14:47,717 INFO [train.py:471] (3/4) Epoch 317, batch 8, global_batch_idx: 11700, batch size: 63, loss[discriminator_loss=2.414, discriminator_real_loss=1.173, discriminator_fake_loss=1.242, generator_loss=30.58, generator_mel_loss=20.64, generator_kl_loss=1.947, generator_dur_loss=1.491, generator_adv_loss=2.291, generator_feat_match_loss=4.211, over 63.00 samples.], tot_loss[discriminator_loss=2.448, discriminator_real_loss=1.294, discriminator_fake_loss=1.154, generator_loss=30.76, generator_mel_loss=20.44, generator_kl_loss=1.947, generator_dur_loss=1.496, generator_adv_loss=2.491, generator_feat_match_loss=4.382, over 827.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:17:12,966 INFO [train.py:845] (3/4) Start epoch 318 2024-02-23 04:19:24,904 INFO [train.py:471] (3/4) Epoch 318, batch 21, global_batch_idx: 11750, batch size: 110, loss[discriminator_loss=2.602, discriminator_real_loss=1.175, discriminator_fake_loss=1.427, 
generator_loss=30.55, generator_mel_loss=20.59, generator_kl_loss=1.989, generator_dur_loss=1.468, generator_adv_loss=2.127, generator_feat_match_loss=4.375, over 110.00 samples.], tot_loss[discriminator_loss=2.504, discriminator_real_loss=1.296, discriminator_fake_loss=1.208, generator_loss=30.97, generator_mel_loss=20.34, generator_kl_loss=1.937, generator_dur_loss=1.502, generator_adv_loss=2.588, generator_feat_match_loss=4.609, over 1558.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:20:41,802 INFO [train.py:845] (3/4) Start epoch 319 2024-02-23 04:24:05,782 INFO [train.py:471] (3/4) Epoch 319, batch 34, global_batch_idx: 11800, batch size: 76, loss[discriminator_loss=2.357, discriminator_real_loss=1.256, discriminator_fake_loss=1.102, generator_loss=29.71, generator_mel_loss=19.79, generator_kl_loss=1.902, generator_dur_loss=1.504, generator_adv_loss=2.273, generator_feat_match_loss=4.242, over 76.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.231, discriminator_fake_loss=1.198, generator_loss=29.81, generator_mel_loss=19.97, generator_kl_loss=1.939, generator_dur_loss=1.499, generator_adv_loss=2.403, generator_feat_match_loss=4, over 2569.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:24:05,784 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 04:24:14,154 INFO [train.py:534] (3/4) Epoch 319, validation: discriminator_loss=2.514, discriminator_real_loss=0.9557, discriminator_fake_loss=1.558, generator_loss=30.1, generator_mel_loss=20.71, generator_kl_loss=2.031, generator_dur_loss=1.489, generator_adv_loss=1.866, generator_feat_match_loss=4.004, over 100.00 samples. 
2024-02-23 04:24:14,155 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 04:24:23,322 INFO [train.py:845] (3/4) Start epoch 320 2024-02-23 04:27:56,698 INFO [train.py:845] (3/4) Start epoch 321 2024-02-23 04:29:00,060 INFO [train.py:471] (3/4) Epoch 321, batch 10, global_batch_idx: 11850, batch size: 71, loss[discriminator_loss=2.469, discriminator_real_loss=1.321, discriminator_fake_loss=1.146, generator_loss=30.63, generator_mel_loss=20.41, generator_kl_loss=1.952, generator_dur_loss=1.507, generator_adv_loss=2.436, generator_feat_match_loss=4.32, over 71.00 samples.], tot_loss[discriminator_loss=2.349, discriminator_real_loss=1.164, discriminator_fake_loss=1.185, generator_loss=30.23, generator_mel_loss=20.05, generator_kl_loss=1.945, generator_dur_loss=1.498, generator_adv_loss=2.43, generator_feat_match_loss=4.307, over 815.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:31:23,653 INFO [train.py:845] (3/4) Start epoch 322 2024-02-23 04:33:37,200 INFO [train.py:471] (3/4) Epoch 322, batch 23, global_batch_idx: 11900, batch size: 76, loss[discriminator_loss=2.24, discriminator_real_loss=1.076, discriminator_fake_loss=1.164, generator_loss=30.2, generator_mel_loss=19.55, generator_kl_loss=1.878, generator_dur_loss=1.498, generator_adv_loss=2.848, generator_feat_match_loss=4.418, over 76.00 samples.], tot_loss[discriminator_loss=2.408, discriminator_real_loss=1.23, discriminator_fake_loss=1.178, generator_loss=30.16, generator_mel_loss=19.98, generator_kl_loss=1.942, generator_dur_loss=1.501, generator_adv_loss=2.464, generator_feat_match_loss=4.277, over 1654.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:34:51,099 INFO [train.py:845] (3/4) Start epoch 323 2024-02-23 04:38:19,503 INFO [train.py:471] (3/4) Epoch 323, batch 36, global_batch_idx: 11950, batch size: 64, loss[discriminator_loss=2.395, discriminator_real_loss=1.264, discriminator_fake_loss=1.13, 
generator_loss=29.29, generator_mel_loss=19.35, generator_kl_loss=1.912, generator_dur_loss=1.502, generator_adv_loss=2.461, generator_feat_match_loss=4.066, over 64.00 samples.], tot_loss[discriminator_loss=2.42, discriminator_real_loss=1.25, discriminator_fake_loss=1.17, generator_loss=30.15, generator_mel_loss=19.92, generator_kl_loss=1.937, generator_dur_loss=1.504, generator_adv_loss=2.48, generator_feat_match_loss=4.308, over 2633.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:38:19,966 INFO [train.py:845] (3/4) Start epoch 324 2024-02-23 04:41:51,707 INFO [train.py:845] (3/4) Start epoch 325 2024-02-23 04:43:14,078 INFO [train.py:471] (3/4) Epoch 325, batch 12, global_batch_idx: 12000, batch size: 126, loss[discriminator_loss=2.734, discriminator_real_loss=1.422, discriminator_fake_loss=1.312, generator_loss=30.15, generator_mel_loss=21.31, generator_kl_loss=1.97, generator_dur_loss=1.492, generator_adv_loss=2.199, generator_feat_match_loss=3.18, over 126.00 samples.], tot_loss[discriminator_loss=2.704, discriminator_real_loss=1.358, discriminator_fake_loss=1.347, generator_loss=29.91, generator_mel_loss=20.97, generator_kl_loss=1.971, generator_dur_loss=1.501, generator_adv_loss=2.188, generator_feat_match_loss=3.28, over 1048.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 04:43:14,079 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 04:43:22,887 INFO [train.py:534] (3/4) Epoch 325, validation: discriminator_loss=2.592, discriminator_real_loss=1.371, discriminator_fake_loss=1.221, generator_loss=30.65, generator_mel_loss=21.59, generator_kl_loss=2.037, generator_dur_loss=1.493, generator_adv_loss=2.201, generator_feat_match_loss=3.336, over 100.00 samples. 
2024-02-23 04:43:22,887 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 04:45:33,243 INFO [train.py:845] (3/4) Start epoch 326 2024-02-23 04:48:05,750 INFO [train.py:471] (3/4) Epoch 326, batch 25, global_batch_idx: 12050, batch size: 67, loss[discriminator_loss=2.23, discriminator_real_loss=1.149, discriminator_fake_loss=1.081, generator_loss=30.71, generator_mel_loss=20.11, generator_kl_loss=1.924, generator_dur_loss=1.524, generator_adv_loss=2.562, generator_feat_match_loss=4.582, over 67.00 samples.], tot_loss[discriminator_loss=2.44, discriminator_real_loss=1.226, discriminator_fake_loss=1.213, generator_loss=30.03, generator_mel_loss=20.26, generator_kl_loss=1.932, generator_dur_loss=1.503, generator_adv_loss=2.373, generator_feat_match_loss=3.962, over 1741.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:49:04,406 INFO [train.py:845] (3/4) Start epoch 327 2024-02-23 04:52:38,954 INFO [train.py:845] (3/4) Start epoch 328 2024-02-23 04:52:58,603 INFO [train.py:471] (3/4) Epoch 328, batch 1, global_batch_idx: 12100, batch size: 53, loss[discriminator_loss=2.418, discriminator_real_loss=1.219, discriminator_fake_loss=1.198, generator_loss=30.99, generator_mel_loss=20.51, generator_kl_loss=1.992, generator_dur_loss=1.523, generator_adv_loss=2.43, generator_feat_match_loss=4.535, over 53.00 samples.], tot_loss[discriminator_loss=2.448, discriminator_real_loss=1.29, discriminator_fake_loss=1.157, generator_loss=30.95, generator_mel_loss=20.47, generator_kl_loss=1.947, generator_dur_loss=1.524, generator_adv_loss=2.446, generator_feat_match_loss=4.561, over 116.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:56:07,349 INFO [train.py:845] (3/4) Start epoch 329 2024-02-23 04:57:31,839 INFO [train.py:471] (3/4) Epoch 329, batch 14, global_batch_idx: 12150, batch size: 73, loss[discriminator_loss=2.396, discriminator_real_loss=1.135, discriminator_fake_loss=1.262, 
generator_loss=29.69, generator_mel_loss=20.19, generator_kl_loss=1.865, generator_dur_loss=1.505, generator_adv_loss=2.26, generator_feat_match_loss=3.867, over 73.00 samples.], tot_loss[discriminator_loss=2.433, discriminator_real_loss=1.217, discriminator_fake_loss=1.216, generator_loss=29.75, generator_mel_loss=20.07, generator_kl_loss=1.936, generator_dur_loss=1.502, generator_adv_loss=2.344, generator_feat_match_loss=3.903, over 1173.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 04:59:29,599 INFO [train.py:845] (3/4) Start epoch 330 2024-02-23 05:02:11,446 INFO [train.py:471] (3/4) Epoch 330, batch 27, global_batch_idx: 12200, batch size: 53, loss[discriminator_loss=2.443, discriminator_real_loss=1.24, discriminator_fake_loss=1.203, generator_loss=28.98, generator_mel_loss=19.4, generator_kl_loss=1.896, generator_dur_loss=1.519, generator_adv_loss=2.381, generator_feat_match_loss=3.787, over 53.00 samples.], tot_loss[discriminator_loss=2.372, discriminator_real_loss=1.209, discriminator_fake_loss=1.163, generator_loss=30.36, generator_mel_loss=20.02, generator_kl_loss=1.932, generator_dur_loss=1.497, generator_adv_loss=2.504, generator_feat_match_loss=4.408, over 2359.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:02:11,447 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 05:02:19,792 INFO [train.py:534] (3/4) Epoch 330, validation: discriminator_loss=2.448, discriminator_real_loss=1.187, discriminator_fake_loss=1.261, generator_loss=30.16, generator_mel_loss=20.3, generator_kl_loss=2.144, generator_dur_loss=1.508, generator_adv_loss=2.25, generator_feat_match_loss=3.959, over 100.00 samples. 
2024-02-23 05:02:19,793 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 05:03:05,432 INFO [train.py:845] (3/4) Start epoch 331 2024-02-23 05:06:29,957 INFO [train.py:845] (3/4) Start epoch 332 2024-02-23 05:06:57,349 INFO [train.py:471] (3/4) Epoch 332, batch 3, global_batch_idx: 12250, batch size: 56, loss[discriminator_loss=2.352, discriminator_real_loss=1.194, discriminator_fake_loss=1.157, generator_loss=30.46, generator_mel_loss=20.21, generator_kl_loss=1.971, generator_dur_loss=1.502, generator_adv_loss=2.467, generator_feat_match_loss=4.309, over 56.00 samples.], tot_loss[discriminator_loss=2.387, discriminator_real_loss=1.231, discriminator_fake_loss=1.156, generator_loss=30.52, generator_mel_loss=20.31, generator_kl_loss=1.944, generator_dur_loss=1.502, generator_adv_loss=2.446, generator_feat_match_loss=4.312, over 223.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:09:56,053 INFO [train.py:845] (3/4) Start epoch 333 2024-02-23 05:11:32,313 INFO [train.py:471] (3/4) Epoch 333, batch 16, global_batch_idx: 12300, batch size: 49, loss[discriminator_loss=2.973, discriminator_real_loss=1.447, discriminator_fake_loss=1.525, generator_loss=30.43, generator_mel_loss=19.71, generator_kl_loss=1.906, generator_dur_loss=1.524, generator_adv_loss=3.328, generator_feat_match_loss=3.963, over 49.00 samples.], tot_loss[discriminator_loss=2.403, discriminator_real_loss=1.217, discriminator_fake_loss=1.187, generator_loss=30.34, generator_mel_loss=19.95, generator_kl_loss=1.962, generator_dur_loss=1.499, generator_adv_loss=2.561, generator_feat_match_loss=4.365, over 1116.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:13:27,255 INFO [train.py:845] (3/4) Start epoch 334 2024-02-23 05:16:12,050 INFO [train.py:471] (3/4) Epoch 334, batch 29, global_batch_idx: 12350, batch size: 101, loss[discriminator_loss=2.715, discriminator_real_loss=1.31, 
discriminator_fake_loss=1.404, generator_loss=30.07, generator_mel_loss=20.49, generator_kl_loss=1.96, generator_dur_loss=1.495, generator_adv_loss=2.404, generator_feat_match_loss=3.715, over 101.00 samples.], tot_loss[discriminator_loss=2.43, discriminator_real_loss=1.211, discriminator_fake_loss=1.219, generator_loss=30.21, generator_mel_loss=20.1, generator_kl_loss=1.935, generator_dur_loss=1.498, generator_adv_loss=2.45, generator_feat_match_loss=4.233, over 2274.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:16:50,551 INFO [train.py:845] (3/4) Start epoch 335 2024-02-23 05:20:22,097 INFO [train.py:845] (3/4) Start epoch 336 2024-02-23 05:21:03,982 INFO [train.py:471] (3/4) Epoch 336, batch 5, global_batch_idx: 12400, batch size: 64, loss[discriminator_loss=2.303, discriminator_real_loss=1.172, discriminator_fake_loss=1.131, generator_loss=29.79, generator_mel_loss=19.64, generator_kl_loss=1.865, generator_dur_loss=1.481, generator_adv_loss=2.449, generator_feat_match_loss=4.363, over 64.00 samples.], tot_loss[discriminator_loss=2.325, discriminator_real_loss=1.164, discriminator_fake_loss=1.161, generator_loss=30.18, generator_mel_loss=19.84, generator_kl_loss=1.957, generator_dur_loss=1.502, generator_adv_loss=2.448, generator_feat_match_loss=4.432, over 459.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 05:21:03,984 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 05:21:13,072 INFO [train.py:534] (3/4) Epoch 336, validation: discriminator_loss=2.295, discriminator_real_loss=1.055, discriminator_fake_loss=1.24, generator_loss=31.13, generator_mel_loss=20.88, generator_kl_loss=1.995, generator_dur_loss=1.493, generator_adv_loss=2.297, generator_feat_match_loss=4.464, over 100.00 samples. 
2024-02-23 05:21:13,072 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 05:24:00,841 INFO [train.py:845] (3/4) Start epoch 337 2024-02-23 05:25:50,222 INFO [train.py:471] (3/4) Epoch 337, batch 18, global_batch_idx: 12450, batch size: 61, loss[discriminator_loss=2.344, discriminator_real_loss=1.174, discriminator_fake_loss=1.17, generator_loss=29.85, generator_mel_loss=19.52, generator_kl_loss=1.881, generator_dur_loss=1.502, generator_adv_loss=2.512, generator_feat_match_loss=4.438, over 61.00 samples.], tot_loss[discriminator_loss=2.363, discriminator_real_loss=1.199, discriminator_fake_loss=1.164, generator_loss=30.21, generator_mel_loss=19.89, generator_kl_loss=1.942, generator_dur_loss=1.509, generator_adv_loss=2.472, generator_feat_match_loss=4.397, over 1198.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 05:27:27,240 INFO [train.py:845] (3/4) Start epoch 338 2024-02-23 05:30:24,520 INFO [train.py:471] (3/4) Epoch 338, batch 31, global_batch_idx: 12500, batch size: 52, loss[discriminator_loss=2.488, discriminator_real_loss=1.361, discriminator_fake_loss=1.127, generator_loss=29.98, generator_mel_loss=20.23, generator_kl_loss=1.926, generator_dur_loss=1.502, generator_adv_loss=2.312, generator_feat_match_loss=4.008, over 52.00 samples.], tot_loss[discriminator_loss=2.438, discriminator_real_loss=1.237, discriminator_fake_loss=1.201, generator_loss=30.39, generator_mel_loss=20.01, generator_kl_loss=1.923, generator_dur_loss=1.5, generator_adv_loss=2.515, generator_feat_match_loss=4.437, over 2324.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:30:55,828 INFO [train.py:845] (3/4) Start epoch 339 2024-02-23 05:34:24,519 INFO [train.py:845] (3/4) Start epoch 340 2024-02-23 05:35:19,487 INFO [train.py:471] (3/4) Epoch 340, batch 7, global_batch_idx: 12550, batch size: 51, loss[discriminator_loss=2.314, discriminator_real_loss=1.16, discriminator_fake_loss=1.154, 
generator_loss=30.31, generator_mel_loss=20, generator_kl_loss=1.877, generator_dur_loss=1.496, generator_adv_loss=2.418, generator_feat_match_loss=4.516, over 51.00 samples.], tot_loss[discriminator_loss=2.398, discriminator_real_loss=1.222, discriminator_fake_loss=1.176, generator_loss=29.98, generator_mel_loss=19.91, generator_kl_loss=1.943, generator_dur_loss=1.507, generator_adv_loss=2.433, generator_feat_match_loss=4.183, over 516.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:37:52,883 INFO [train.py:845] (3/4) Start epoch 341 2024-02-23 05:40:02,105 INFO [train.py:471] (3/4) Epoch 341, batch 20, global_batch_idx: 12600, batch size: 50, loss[discriminator_loss=2.309, discriminator_real_loss=1.103, discriminator_fake_loss=1.206, generator_loss=29.33, generator_mel_loss=19.1, generator_kl_loss=1.83, generator_dur_loss=1.524, generator_adv_loss=2.58, generator_feat_match_loss=4.293, over 50.00 samples.], tot_loss[discriminator_loss=2.432, discriminator_real_loss=1.223, discriminator_fake_loss=1.209, generator_loss=30.53, generator_mel_loss=19.84, generator_kl_loss=1.933, generator_dur_loss=1.501, generator_adv_loss=2.602, generator_feat_match_loss=4.66, over 1636.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:40:02,107 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 05:40:10,093 INFO [train.py:534] (3/4) Epoch 341, validation: discriminator_loss=2.626, discriminator_real_loss=1.247, discriminator_fake_loss=1.379, generator_loss=29.93, generator_mel_loss=20.65, generator_kl_loss=1.95, generator_dur_loss=1.49, generator_adv_loss=2.082, generator_feat_match_loss=3.759, over 100.00 samples. 
2024-02-23 05:40:10,094 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 05:41:31,662 INFO [train.py:845] (3/4) Start epoch 342 2024-02-23 05:44:43,832 INFO [train.py:471] (3/4) Epoch 342, batch 33, global_batch_idx: 12650, batch size: 53, loss[discriminator_loss=2.305, discriminator_real_loss=1.146, discriminator_fake_loss=1.16, generator_loss=29.81, generator_mel_loss=19.53, generator_kl_loss=1.861, generator_dur_loss=1.501, generator_adv_loss=2.443, generator_feat_match_loss=4.477, over 53.00 samples.], tot_loss[discriminator_loss=2.416, discriminator_real_loss=1.221, discriminator_fake_loss=1.195, generator_loss=30.33, generator_mel_loss=20.17, generator_kl_loss=1.953, generator_dur_loss=1.494, generator_adv_loss=2.437, generator_feat_match_loss=4.272, over 2455.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:45:00,277 INFO [train.py:845] (3/4) Start epoch 343 2024-02-23 05:48:29,426 INFO [train.py:845] (3/4) Start epoch 344 2024-02-23 05:49:31,271 INFO [train.py:471] (3/4) Epoch 344, batch 9, global_batch_idx: 12700, batch size: 126, loss[discriminator_loss=2.582, discriminator_real_loss=1.332, discriminator_fake_loss=1.25, generator_loss=30.77, generator_mel_loss=20.29, generator_kl_loss=2.028, generator_dur_loss=1.521, generator_adv_loss=2.395, generator_feat_match_loss=4.543, over 126.00 samples.], tot_loss[discriminator_loss=2.387, discriminator_real_loss=1.228, discriminator_fake_loss=1.159, generator_loss=31.35, generator_mel_loss=20.3, generator_kl_loss=1.97, generator_dur_loss=1.503, generator_adv_loss=2.651, generator_feat_match_loss=4.924, over 819.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:52:00,218 INFO [train.py:845] (3/4) Start epoch 345 2024-02-23 05:54:07,564 INFO [train.py:471] (3/4) Epoch 345, batch 22, global_batch_idx: 12750, batch size: 76, loss[discriminator_loss=2.371, discriminator_real_loss=1.133, discriminator_fake_loss=1.239, 
generator_loss=30.02, generator_mel_loss=20.08, generator_kl_loss=1.988, generator_dur_loss=1.515, generator_adv_loss=2.32, generator_feat_match_loss=4.121, over 76.00 samples.], tot_loss[discriminator_loss=2.423, discriminator_real_loss=1.234, discriminator_fake_loss=1.189, generator_loss=30.05, generator_mel_loss=19.97, generator_kl_loss=1.937, generator_dur_loss=1.498, generator_adv_loss=2.418, generator_feat_match_loss=4.231, over 1798.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 16.0 2024-02-23 05:55:32,181 INFO [train.py:845] (3/4) Start epoch 346 2024-02-23 05:58:58,117 INFO [train.py:471] (3/4) Epoch 346, batch 35, global_batch_idx: 12800, batch size: 52, loss[discriminator_loss=2.227, discriminator_real_loss=1.086, discriminator_fake_loss=1.14, generator_loss=29.76, generator_mel_loss=19.23, generator_kl_loss=1.93, generator_dur_loss=1.495, generator_adv_loss=2.398, generator_feat_match_loss=4.703, over 52.00 samples.], tot_loss[discriminator_loss=2.469, discriminator_real_loss=1.245, discriminator_fake_loss=1.224, generator_loss=29.95, generator_mel_loss=20.14, generator_kl_loss=1.947, generator_dur_loss=1.496, generator_adv_loss=2.382, generator_feat_match_loss=3.988, over 2802.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 05:58:58,119 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 05:59:06,638 INFO [train.py:534] (3/4) Epoch 346, validation: discriminator_loss=2.175, discriminator_real_loss=0.9795, discriminator_fake_loss=1.196, generator_loss=31.71, generator_mel_loss=20.91, generator_kl_loss=1.973, generator_dur_loss=1.493, generator_adv_loss=2.358, generator_feat_match_loss=4.973, over 100.00 samples. 
2024-02-23 05:59:06,639 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 05:59:11,531 INFO [train.py:845] (3/4) Start epoch 347 2024-02-23 06:02:37,053 INFO [train.py:845] (3/4) Start epoch 348 2024-02-23 06:03:47,305 INFO [train.py:471] (3/4) Epoch 348, batch 11, global_batch_idx: 12850, batch size: 71, loss[discriminator_loss=2.406, discriminator_real_loss=1.242, discriminator_fake_loss=1.163, generator_loss=29.64, generator_mel_loss=19.87, generator_kl_loss=1.999, generator_dur_loss=1.493, generator_adv_loss=2.414, generator_feat_match_loss=3.865, over 71.00 samples.], tot_loss[discriminator_loss=2.405, discriminator_real_loss=1.211, discriminator_fake_loss=1.193, generator_loss=29.99, generator_mel_loss=19.96, generator_kl_loss=1.968, generator_dur_loss=1.503, generator_adv_loss=2.424, generator_feat_match_loss=4.131, over 817.00 samples.], cur_lr_g: 1.92e-04, cur_lr_d: 1.92e-04, grad_scale: 32.0 2024-02-23 06:06:01,464 INFO [train.py:845] (3/4) Start epoch 349 2024-02-23 06:08:25,293 INFO [train.py:471] (3/4) Epoch 349, batch 24, global_batch_idx: 12900, batch size: 58, loss[discriminator_loss=2.432, discriminator_real_loss=1.232, discriminator_fake_loss=1.199, generator_loss=29.19, generator_mel_loss=19.49, generator_kl_loss=1.923, generator_dur_loss=1.519, generator_adv_loss=2.324, generator_feat_match_loss=3.938, over 58.00 samples.], tot_loss[discriminator_loss=2.406, discriminator_real_loss=1.199, discriminator_fake_loss=1.206, generator_loss=29.89, generator_mel_loss=19.93, generator_kl_loss=1.92, generator_dur_loss=1.497, generator_adv_loss=2.397, generator_feat_match_loss=4.15, over 1975.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 06:09:24,818 INFO [train.py:845] (3/4) Start epoch 350 2024-02-23 06:12:54,575 INFO [train.py:845] (3/4) Start epoch 351 2024-02-23 06:13:08,151 INFO [train.py:471] (3/4) Epoch 351, batch 0, global_batch_idx: 12950, batch size: 90, 
loss[discriminator_loss=2.492, discriminator_real_loss=1.222, discriminator_fake_loss=1.271, generator_loss=29.2, generator_mel_loss=19.5, generator_kl_loss=1.992, generator_dur_loss=1.508, generator_adv_loss=2.32, generator_feat_match_loss=3.879, over 90.00 samples.], tot_loss[discriminator_loss=2.492, discriminator_real_loss=1.222, discriminator_fake_loss=1.271, generator_loss=29.2, generator_mel_loss=19.5, generator_kl_loss=1.992, generator_dur_loss=1.508, generator_adv_loss=2.32, generator_feat_match_loss=3.879, over 90.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:16:24,412 INFO [train.py:845] (3/4) Start epoch 352 2024-02-23 06:17:38,364 INFO [train.py:471] (3/4) Epoch 352, batch 13, global_batch_idx: 13000, batch size: 52, loss[discriminator_loss=2.16, discriminator_real_loss=1.14, discriminator_fake_loss=1.021, generator_loss=30.82, generator_mel_loss=20.02, generator_kl_loss=1.979, generator_dur_loss=1.504, generator_adv_loss=2.553, generator_feat_match_loss=4.766, over 52.00 samples.], tot_loss[discriminator_loss=2.359, discriminator_real_loss=1.202, discriminator_fake_loss=1.157, generator_loss=30.67, generator_mel_loss=20.04, generator_kl_loss=1.949, generator_dur_loss=1.5, generator_adv_loss=2.55, generator_feat_match_loss=4.636, over 962.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:17:38,365 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 06:17:46,857 INFO [train.py:534] (3/4) Epoch 352, validation: discriminator_loss=2.179, discriminator_real_loss=0.9878, discriminator_fake_loss=1.191, generator_loss=32.01, generator_mel_loss=21.16, generator_kl_loss=2, generator_dur_loss=1.499, generator_adv_loss=2.444, generator_feat_match_loss=4.908, over 100.00 samples. 
2024-02-23 06:17:46,857 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 06:19:58,643 INFO [train.py:845] (3/4) Start epoch 353 2024-02-23 06:22:35,588 INFO [train.py:471] (3/4) Epoch 353, batch 26, global_batch_idx: 13050, batch size: 61, loss[discriminator_loss=2.742, discriminator_real_loss=1.545, discriminator_fake_loss=1.197, generator_loss=30.18, generator_mel_loss=20.01, generator_kl_loss=2.042, generator_dur_loss=1.487, generator_adv_loss=2.402, generator_feat_match_loss=4.242, over 61.00 samples.], tot_loss[discriminator_loss=2.418, discriminator_real_loss=1.259, discriminator_fake_loss=1.159, generator_loss=30.45, generator_mel_loss=19.88, generator_kl_loss=1.921, generator_dur_loss=1.505, generator_adv_loss=2.564, generator_feat_match_loss=4.587, over 1814.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:23:29,163 INFO [train.py:845] (3/4) Start epoch 354 2024-02-23 06:26:57,667 INFO [train.py:845] (3/4) Start epoch 355 2024-02-23 06:27:25,308 INFO [train.py:471] (3/4) Epoch 355, batch 2, global_batch_idx: 13100, batch size: 101, loss[discriminator_loss=2.453, discriminator_real_loss=1.198, discriminator_fake_loss=1.254, generator_loss=30.09, generator_mel_loss=19.95, generator_kl_loss=1.866, generator_dur_loss=1.47, generator_adv_loss=2.371, generator_feat_match_loss=4.438, over 101.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.236, discriminator_fake_loss=1.294, generator_loss=29.91, generator_mel_loss=19.94, generator_kl_loss=1.875, generator_dur_loss=1.485, generator_adv_loss=2.32, generator_feat_match_loss=4.292, over 209.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:30:23,498 INFO [train.py:845] (3/4) Start epoch 356 2024-02-23 06:31:59,784 INFO [train.py:471] (3/4) Epoch 356, batch 15, global_batch_idx: 13150, batch size: 51, loss[discriminator_loss=2.547, discriminator_real_loss=1.382, 
discriminator_fake_loss=1.166, generator_loss=29.49, generator_mel_loss=20.24, generator_kl_loss=1.867, generator_dur_loss=1.517, generator_adv_loss=2.541, generator_feat_match_loss=3.332, over 51.00 samples.], tot_loss[discriminator_loss=2.399, discriminator_real_loss=1.194, discriminator_fake_loss=1.205, generator_loss=29.64, generator_mel_loss=19.78, generator_kl_loss=1.944, generator_dur_loss=1.499, generator_adv_loss=2.374, generator_feat_match_loss=4.046, over 1117.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:33:50,081 INFO [train.py:845] (3/4) Start epoch 357 2024-02-23 06:36:27,719 INFO [train.py:471] (3/4) Epoch 357, batch 28, global_batch_idx: 13200, batch size: 76, loss[discriminator_loss=2.539, discriminator_real_loss=1.19, discriminator_fake_loss=1.348, generator_loss=29.81, generator_mel_loss=20.21, generator_kl_loss=1.984, generator_dur_loss=1.497, generator_adv_loss=2.305, generator_feat_match_loss=3.816, over 76.00 samples.], tot_loss[discriminator_loss=2.388, discriminator_real_loss=1.228, discriminator_fake_loss=1.161, generator_loss=30.44, generator_mel_loss=19.91, generator_kl_loss=1.93, generator_dur_loss=1.495, generator_adv_loss=2.559, generator_feat_match_loss=4.545, over 2165.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 06:36:27,720 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 06:36:36,458 INFO [train.py:534] (3/4) Epoch 357, validation: discriminator_loss=2.469, discriminator_real_loss=1.151, discriminator_fake_loss=1.318, generator_loss=30.55, generator_mel_loss=20.84, generator_kl_loss=2.001, generator_dur_loss=1.496, generator_adv_loss=2.216, generator_feat_match_loss=4, over 100.00 samples. 
2024-02-23 06:36:36,459 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 06:37:18,078 INFO [train.py:845] (3/4) Start epoch 358 2024-02-23 06:40:48,354 INFO [train.py:845] (3/4) Start epoch 359 2024-02-23 06:41:24,863 INFO [train.py:471] (3/4) Epoch 359, batch 4, global_batch_idx: 13250, batch size: 52, loss[discriminator_loss=2.422, discriminator_real_loss=1.294, discriminator_fake_loss=1.129, generator_loss=31.06, generator_mel_loss=20.41, generator_kl_loss=2.077, generator_dur_loss=1.509, generator_adv_loss=2.484, generator_feat_match_loss=4.582, over 52.00 samples.], tot_loss[discriminator_loss=2.63, discriminator_real_loss=1.383, discriminator_fake_loss=1.247, generator_loss=30.47, generator_mel_loss=20.14, generator_kl_loss=1.949, generator_dur_loss=1.492, generator_adv_loss=2.423, generator_feat_match_loss=4.462, over 382.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 06:44:08,255 INFO [train.py:845] (3/4) Start epoch 360 2024-02-23 06:46:07,012 INFO [train.py:471] (3/4) Epoch 360, batch 17, global_batch_idx: 13300, batch size: 90, loss[discriminator_loss=2.324, discriminator_real_loss=1.183, discriminator_fake_loss=1.143, generator_loss=29.79, generator_mel_loss=19.65, generator_kl_loss=1.981, generator_dur_loss=1.485, generator_adv_loss=2.447, generator_feat_match_loss=4.227, over 90.00 samples.], tot_loss[discriminator_loss=2.407, discriminator_real_loss=1.187, discriminator_fake_loss=1.221, generator_loss=29.8, generator_mel_loss=19.9, generator_kl_loss=1.954, generator_dur_loss=1.498, generator_adv_loss=2.375, generator_feat_match_loss=4.077, over 1419.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 06:47:43,195 INFO [train.py:845] (3/4) Start epoch 361 2024-02-23 06:50:45,482 INFO [train.py:471] (3/4) Epoch 361, batch 30, global_batch_idx: 13350, batch size: 65, loss[discriminator_loss=2.848, discriminator_real_loss=1.511, discriminator_fake_loss=1.338, 
generator_loss=28.09, generator_mel_loss=19.98, generator_kl_loss=2.035, generator_dur_loss=1.511, generator_adv_loss=1.828, generator_feat_match_loss=2.742, over 65.00 samples.], tot_loss[discriminator_loss=2.952, discriminator_real_loss=1.574, discriminator_fake_loss=1.378, generator_loss=28.91, generator_mel_loss=20.03, generator_kl_loss=1.953, generator_dur_loss=1.499, generator_adv_loss=2.227, generator_feat_match_loss=3.207, over 2238.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 06:51:17,671 INFO [train.py:845] (3/4) Start epoch 362 2024-02-23 06:54:48,615 INFO [train.py:845] (3/4) Start epoch 363 2024-02-23 06:55:26,520 INFO [train.py:471] (3/4) Epoch 363, batch 6, global_batch_idx: 13400, batch size: 81, loss[discriminator_loss=2.535, discriminator_real_loss=1.37, discriminator_fake_loss=1.166, generator_loss=30.13, generator_mel_loss=20.97, generator_kl_loss=1.919, generator_dur_loss=1.498, generator_adv_loss=2.117, generator_feat_match_loss=3.635, over 81.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.302, discriminator_fake_loss=1.268, generator_loss=29.69, generator_mel_loss=20.63, generator_kl_loss=1.924, generator_dur_loss=1.507, generator_adv_loss=2.156, generator_feat_match_loss=3.481, over 455.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 06:55:26,521 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 06:55:35,282 INFO [train.py:534] (3/4) Epoch 363, validation: discriminator_loss=2.665, discriminator_real_loss=1.161, discriminator_fake_loss=1.504, generator_loss=29.77, generator_mel_loss=21.12, generator_kl_loss=2.113, generator_dur_loss=1.49, generator_adv_loss=1.811, generator_feat_match_loss=3.239, over 100.00 samples. 
2024-02-23 06:55:35,283 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 06:58:26,505 INFO [train.py:845] (3/4) Start epoch 364 2024-02-23 07:00:27,291 INFO [train.py:471] (3/4) Epoch 364, batch 19, global_batch_idx: 13450, batch size: 85, loss[discriminator_loss=2.746, discriminator_real_loss=1.154, discriminator_fake_loss=1.592, generator_loss=28.4, generator_mel_loss=19.8, generator_kl_loss=1.911, generator_dur_loss=1.494, generator_adv_loss=2.139, generator_feat_match_loss=3.059, over 85.00 samples.], tot_loss[discriminator_loss=2.649, discriminator_real_loss=1.339, discriminator_fake_loss=1.31, generator_loss=29.65, generator_mel_loss=20.38, generator_kl_loss=1.928, generator_dur_loss=1.493, generator_adv_loss=2.244, generator_feat_match_loss=3.61, over 1725.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:01:58,704 INFO [train.py:845] (3/4) Start epoch 365 2024-02-23 07:05:02,238 INFO [train.py:471] (3/4) Epoch 365, batch 32, global_batch_idx: 13500, batch size: 63, loss[discriminator_loss=2.871, discriminator_real_loss=1.238, discriminator_fake_loss=1.632, generator_loss=29.36, generator_mel_loss=20.7, generator_kl_loss=1.927, generator_dur_loss=1.49, generator_adv_loss=1.86, generator_feat_match_loss=3.383, over 63.00 samples.], tot_loss[discriminator_loss=2.51, discriminator_real_loss=1.247, discriminator_fake_loss=1.263, generator_loss=29.95, generator_mel_loss=20.41, generator_kl_loss=1.922, generator_dur_loss=1.49, generator_adv_loss=2.261, generator_feat_match_loss=3.869, over 2508.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:05:24,440 INFO [train.py:845] (3/4) Start epoch 366 2024-02-23 07:08:53,018 INFO [train.py:845] (3/4) Start epoch 367 2024-02-23 07:09:50,975 INFO [train.py:471] (3/4) Epoch 367, batch 8, global_batch_idx: 13550, batch size: 79, loss[discriminator_loss=2.562, discriminator_real_loss=1.17, discriminator_fake_loss=1.393, 
generator_loss=29.38, generator_mel_loss=20.3, generator_kl_loss=1.934, generator_dur_loss=1.501, generator_adv_loss=2.109, generator_feat_match_loss=3.533, over 79.00 samples.], tot_loss[discriminator_loss=2.436, discriminator_real_loss=1.203, discriminator_fake_loss=1.233, generator_loss=30.13, generator_mel_loss=20.2, generator_kl_loss=1.926, generator_dur_loss=1.496, generator_adv_loss=2.347, generator_feat_match_loss=4.161, over 689.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0 2024-02-23 07:12:20,101 INFO [train.py:845] (3/4) Start epoch 368 2024-02-23 07:14:36,909 INFO [train.py:471] (3/4) Epoch 368, batch 21, global_batch_idx: 13600, batch size: 64, loss[discriminator_loss=2.416, discriminator_real_loss=1.296, discriminator_fake_loss=1.12, generator_loss=29.56, generator_mel_loss=20.2, generator_kl_loss=1.958, generator_dur_loss=1.485, generator_adv_loss=2.1, generator_feat_match_loss=3.82, over 64.00 samples.], tot_loss[discriminator_loss=2.702, discriminator_real_loss=1.389, discriminator_fake_loss=1.312, generator_loss=29.23, generator_mel_loss=20.24, generator_kl_loss=1.925, generator_dur_loss=1.5, generator_adv_loss=2.135, generator_feat_match_loss=3.429, over 1563.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:14:36,911 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 07:14:45,799 INFO [train.py:534] (3/4) Epoch 368, validation: discriminator_loss=2.368, discriminator_real_loss=1.078, discriminator_fake_loss=1.29, generator_loss=30.5, generator_mel_loss=20.97, generator_kl_loss=2.089, generator_dur_loss=1.49, generator_adv_loss=1.956, generator_feat_match_loss=3.988, over 100.00 samples. 
2024-02-23 07:14:45,800 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 07:16:03,214 INFO [train.py:845] (3/4) Start epoch 369 2024-02-23 07:19:16,969 INFO [train.py:471] (3/4) Epoch 369, batch 34, global_batch_idx: 13650, batch size: 110, loss[discriminator_loss=2.227, discriminator_real_loss=1.033, discriminator_fake_loss=1.194, generator_loss=30.46, generator_mel_loss=20.04, generator_kl_loss=1.92, generator_dur_loss=1.51, generator_adv_loss=2.227, generator_feat_match_loss=4.77, over 110.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.215, discriminator_fake_loss=1.214, generator_loss=30.13, generator_mel_loss=20.04, generator_kl_loss=1.942, generator_dur_loss=1.492, generator_adv_loss=2.439, generator_feat_match_loss=4.212, over 2804.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:19:30,882 INFO [train.py:845] (3/4) Start epoch 370 2024-02-23 07:23:01,339 INFO [train.py:845] (3/4) Start epoch 371 2024-02-23 07:24:08,321 INFO [train.py:471] (3/4) Epoch 371, batch 10, global_batch_idx: 13700, batch size: 58, loss[discriminator_loss=2.648, discriminator_real_loss=1.378, discriminator_fake_loss=1.27, generator_loss=29.93, generator_mel_loss=19.89, generator_kl_loss=1.899, generator_dur_loss=1.502, generator_adv_loss=2.594, generator_feat_match_loss=4.047, over 58.00 samples.], tot_loss[discriminator_loss=2.279, discriminator_real_loss=1.092, discriminator_fake_loss=1.188, generator_loss=30.63, generator_mel_loss=20.15, generator_kl_loss=1.929, generator_dur_loss=1.491, generator_adv_loss=2.485, generator_feat_match_loss=4.572, over 844.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:26:27,791 INFO [train.py:845] (3/4) Start epoch 372 2024-02-23 07:28:45,223 INFO [train.py:471] (3/4) Epoch 372, batch 23, global_batch_idx: 13750, batch size: 101, loss[discriminator_loss=2.445, discriminator_real_loss=1.272, 
discriminator_fake_loss=1.174, generator_loss=30.11, generator_mel_loss=19.98, generator_kl_loss=1.99, generator_dur_loss=1.458, generator_adv_loss=2.512, generator_feat_match_loss=4.164, over 101.00 samples.], tot_loss[discriminator_loss=2.427, discriminator_real_loss=1.206, discriminator_fake_loss=1.222, generator_loss=29.98, generator_mel_loss=20.01, generator_kl_loss=1.943, generator_dur_loss=1.497, generator_adv_loss=2.377, generator_feat_match_loss=4.154, over 1849.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:29:55,576 INFO [train.py:845] (3/4) Start epoch 373 2024-02-23 07:33:23,722 INFO [train.py:471] (3/4) Epoch 373, batch 36, global_batch_idx: 13800, batch size: 110, loss[discriminator_loss=2.227, discriminator_real_loss=1.115, discriminator_fake_loss=1.111, generator_loss=31.1, generator_mel_loss=20.38, generator_kl_loss=1.935, generator_dur_loss=1.482, generator_adv_loss=2.457, generator_feat_match_loss=4.852, over 110.00 samples.], tot_loss[discriminator_loss=2.402, discriminator_real_loss=1.206, discriminator_fake_loss=1.196, generator_loss=30.2, generator_mel_loss=20, generator_kl_loss=1.933, generator_dur_loss=1.496, generator_adv_loss=2.428, generator_feat_match_loss=4.35, over 2602.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:33:23,724 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 07:33:32,094 INFO [train.py:534] (3/4) Epoch 373, validation: discriminator_loss=2.189, discriminator_real_loss=0.9941, discriminator_fake_loss=1.195, generator_loss=31.43, generator_mel_loss=20.77, generator_kl_loss=1.955, generator_dur_loss=1.489, generator_adv_loss=2.378, generator_feat_match_loss=4.845, over 100.00 samples. 
2024-02-23 07:33:32,095 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 07:33:32,524 INFO [train.py:845] (3/4) Start epoch 374 2024-02-23 07:36:55,103 INFO [train.py:845] (3/4) Start epoch 375 2024-02-23 07:38:05,475 INFO [train.py:471] (3/4) Epoch 375, batch 12, global_batch_idx: 13850, batch size: 85, loss[discriminator_loss=2.383, discriminator_real_loss=1.306, discriminator_fake_loss=1.078, generator_loss=30.07, generator_mel_loss=20, generator_kl_loss=1.867, generator_dur_loss=1.493, generator_adv_loss=2.49, generator_feat_match_loss=4.227, over 85.00 samples.], tot_loss[discriminator_loss=2.408, discriminator_real_loss=1.213, discriminator_fake_loss=1.195, generator_loss=29.87, generator_mel_loss=19.81, generator_kl_loss=1.939, generator_dur_loss=1.505, generator_adv_loss=2.425, generator_feat_match_loss=4.197, over 917.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:40:20,467 INFO [train.py:845] (3/4) Start epoch 376 2024-02-23 07:42:44,676 INFO [train.py:471] (3/4) Epoch 376, batch 25, global_batch_idx: 13900, batch size: 60, loss[discriminator_loss=2.891, discriminator_real_loss=1.711, discriminator_fake_loss=1.179, generator_loss=30.43, generator_mel_loss=20.27, generator_kl_loss=1.992, generator_dur_loss=1.49, generator_adv_loss=2.607, generator_feat_match_loss=4.062, over 60.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.233, discriminator_fake_loss=1.201, generator_loss=30.46, generator_mel_loss=20.11, generator_kl_loss=1.924, generator_dur_loss=1.496, generator_adv_loss=2.518, generator_feat_match_loss=4.414, over 1997.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:43:44,226 INFO [train.py:845] (3/4) Start epoch 377 2024-02-23 07:47:11,809 INFO [train.py:845] (3/4) Start epoch 378 2024-02-23 07:47:31,313 INFO [train.py:471] (3/4) Epoch 378, batch 1, global_batch_idx: 13950, batch size: 52, 
loss[discriminator_loss=2.449, discriminator_real_loss=1.273, discriminator_fake_loss=1.175, generator_loss=29.75, generator_mel_loss=20.17, generator_kl_loss=1.898, generator_dur_loss=1.51, generator_adv_loss=2.43, generator_feat_match_loss=3.746, over 52.00 samples.], tot_loss[discriminator_loss=2.471, discriminator_real_loss=1.275, discriminator_fake_loss=1.196, generator_loss=29.66, generator_mel_loss=19.97, generator_kl_loss=1.918, generator_dur_loss=1.517, generator_adv_loss=2.427, generator_feat_match_loss=3.83, over 107.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0 2024-02-23 07:50:47,052 INFO [train.py:845] (3/4) Start epoch 379 2024-02-23 07:52:12,004 INFO [train.py:471] (3/4) Epoch 379, batch 14, global_batch_idx: 14000, batch size: 76, loss[discriminator_loss=2.209, discriminator_real_loss=1.034, discriminator_fake_loss=1.175, generator_loss=30.04, generator_mel_loss=19.5, generator_kl_loss=1.865, generator_dur_loss=1.497, generator_adv_loss=2.412, generator_feat_match_loss=4.773, over 76.00 samples.], tot_loss[discriminator_loss=2.456, discriminator_real_loss=1.224, discriminator_fake_loss=1.232, generator_loss=29.95, generator_mel_loss=20.07, generator_kl_loss=1.942, generator_dur_loss=1.496, generator_adv_loss=2.373, generator_feat_match_loss=4.063, over 1017.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 32.0 2024-02-23 07:52:12,005 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 07:52:20,428 INFO [train.py:534] (3/4) Epoch 379, validation: discriminator_loss=2.154, discriminator_real_loss=0.9438, discriminator_fake_loss=1.21, generator_loss=31.7, generator_mel_loss=20.77, generator_kl_loss=2.049, generator_dur_loss=1.48, generator_adv_loss=2.359, generator_feat_match_loss=5.046, over 100.00 samples. 
2024-02-23 07:52:20,429 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 07:54:21,494 INFO [train.py:845] (3/4) Start epoch 380
2024-02-23 07:56:50,609 INFO [train.py:471] (3/4) Epoch 380, batch 27, global_batch_idx: 14050, batch size: 79, loss[discriminator_loss=2.723, discriminator_real_loss=1.54, discriminator_fake_loss=1.182, generator_loss=29.77, generator_mel_loss=20.03, generator_kl_loss=1.897, generator_dur_loss=1.525, generator_adv_loss=2.545, generator_feat_match_loss=3.777, over 79.00 samples.], tot_loss[discriminator_loss=2.508, discriminator_real_loss=1.254, discriminator_fake_loss=1.254, generator_loss=29.75, generator_mel_loss=20.05, generator_kl_loss=1.917, generator_dur_loss=1.504, generator_adv_loss=2.351, generator_feat_match_loss=3.931, over 1902.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0
2024-02-23 07:57:41,838 INFO [train.py:845] (3/4) Start epoch 381
2024-02-23 08:01:08,164 INFO [train.py:845] (3/4) Start epoch 382
2024-02-23 08:01:37,740 INFO [train.py:471] (3/4) Epoch 382, batch 3, global_batch_idx: 14100, batch size: 59, loss[discriminator_loss=2.758, discriminator_real_loss=1.517, discriminator_fake_loss=1.242, generator_loss=29.85, generator_mel_loss=19.94, generator_kl_loss=1.941, generator_dur_loss=1.492, generator_adv_loss=2.506, generator_feat_match_loss=3.969, over 59.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.277, discriminator_fake_loss=1.293, generator_loss=29.83, generator_mel_loss=20.35, generator_kl_loss=1.902, generator_dur_loss=1.491, generator_adv_loss=2.255, generator_feat_match_loss=3.825, over 351.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:04:36,322 INFO [train.py:845] (3/4) Start epoch 383
2024-02-23 08:06:21,124 INFO [train.py:471] (3/4) Epoch 383, batch 16, global_batch_idx: 14150, batch size: 63, loss[discriminator_loss=2.523, discriminator_real_loss=1.341, discriminator_fake_loss=1.182, generator_loss=29.35, generator_mel_loss=19.7, generator_kl_loss=1.876, generator_dur_loss=1.506, generator_adv_loss=2.445, generator_feat_match_loss=3.828, over 63.00 samples.], tot_loss[discriminator_loss=2.492, discriminator_real_loss=1.251, discriminator_fake_loss=1.242, generator_loss=29.67, generator_mel_loss=19.89, generator_kl_loss=1.929, generator_dur_loss=1.491, generator_adv_loss=2.319, generator_feat_match_loss=4.036, over 1273.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:08:09,099 INFO [train.py:845] (3/4) Start epoch 384
2024-02-23 08:11:02,656 INFO [train.py:471] (3/4) Epoch 384, batch 29, global_batch_idx: 14200, batch size: 73, loss[discriminator_loss=2.285, discriminator_real_loss=1.169, discriminator_fake_loss=1.117, generator_loss=30.44, generator_mel_loss=19.99, generator_kl_loss=1.868, generator_dur_loss=1.511, generator_adv_loss=2.441, generator_feat_match_loss=4.629, over 73.00 samples.], tot_loss[discriminator_loss=2.425, discriminator_real_loss=1.207, discriminator_fake_loss=1.218, generator_loss=29.97, generator_mel_loss=20.02, generator_kl_loss=1.917, generator_dur_loss=1.497, generator_adv_loss=2.367, generator_feat_match_loss=4.161, over 2290.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:11:02,658 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 08:11:10,780 INFO [train.py:534] (3/4) Epoch 384, validation: discriminator_loss=2.272, discriminator_real_loss=1.044, discriminator_fake_loss=1.228, generator_loss=30.95, generator_mel_loss=20.51, generator_kl_loss=1.974, generator_dur_loss=1.491, generator_adv_loss=2.349, generator_feat_match_loss=4.633, over 100.00 samples.
2024-02-23 08:11:10,781 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 08:11:45,318 INFO [train.py:845] (3/4) Start epoch 385
2024-02-23 08:15:14,442 INFO [train.py:845] (3/4) Start epoch 386
2024-02-23 08:15:54,561 INFO [train.py:471] (3/4) Epoch 386, batch 5, global_batch_idx: 14250, batch size: 58, loss[discriminator_loss=2.207, discriminator_real_loss=1.144, discriminator_fake_loss=1.062, generator_loss=30.57, generator_mel_loss=19.47, generator_kl_loss=1.907, generator_dur_loss=1.516, generator_adv_loss=2.494, generator_feat_match_loss=5.18, over 58.00 samples.], tot_loss[discriminator_loss=2.311, discriminator_real_loss=1.202, discriminator_fake_loss=1.109, generator_loss=30.79, generator_mel_loss=19.96, generator_kl_loss=1.963, generator_dur_loss=1.489, generator_adv_loss=2.538, generator_feat_match_loss=4.846, over 458.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:18:45,040 INFO [train.py:845] (3/4) Start epoch 387
2024-02-23 08:20:36,750 INFO [train.py:471] (3/4) Epoch 387, batch 18, global_batch_idx: 14300, batch size: 54, loss[discriminator_loss=2.426, discriminator_real_loss=1.218, discriminator_fake_loss=1.208, generator_loss=29.21, generator_mel_loss=19.48, generator_kl_loss=1.873, generator_dur_loss=1.493, generator_adv_loss=2.445, generator_feat_match_loss=3.916, over 54.00 samples.], tot_loss[discriminator_loss=2.404, discriminator_real_loss=1.221, discriminator_fake_loss=1.182, generator_loss=29.81, generator_mel_loss=19.75, generator_kl_loss=1.932, generator_dur_loss=1.496, generator_adv_loss=2.418, generator_feat_match_loss=4.219, over 1252.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:22:14,625 INFO [train.py:845] (3/4) Start epoch 388
2024-02-23 08:25:21,581 INFO [train.py:471] (3/4) Epoch 388, batch 31, global_batch_idx: 14350, batch size: 52, loss[discriminator_loss=2.32, discriminator_real_loss=1.133, discriminator_fake_loss=1.188, generator_loss=30.33, generator_mel_loss=20, generator_kl_loss=2.005, generator_dur_loss=1.473, generator_adv_loss=2.516, generator_feat_match_loss=4.336, over 52.00 samples.], tot_loss[discriminator_loss=2.351, discriminator_real_loss=1.193, discriminator_fake_loss=1.159, generator_loss=30.24, generator_mel_loss=19.79, generator_kl_loss=1.927, generator_dur_loss=1.491, generator_adv_loss=2.482, generator_feat_match_loss=4.552, over 2384.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 8.0
2024-02-23 08:25:43,563 INFO [train.py:845] (3/4) Start epoch 389
2024-02-23 08:29:12,050 INFO [train.py:845] (3/4) Start epoch 390
2024-02-23 08:29:59,953 INFO [train.py:471] (3/4) Epoch 390, batch 7, global_batch_idx: 14400, batch size: 53, loss[discriminator_loss=2.551, discriminator_real_loss=1.377, discriminator_fake_loss=1.174, generator_loss=29.99, generator_mel_loss=20.24, generator_kl_loss=1.85, generator_dur_loss=1.502, generator_adv_loss=2.414, generator_feat_match_loss=3.984, over 53.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.341, discriminator_fake_loss=1.215, generator_loss=29.46, generator_mel_loss=19.85, generator_kl_loss=1.883, generator_dur_loss=1.502, generator_adv_loss=2.435, generator_feat_match_loss=3.791, over 567.00 samples.], cur_lr_g: 1.91e-04, cur_lr_d: 1.91e-04, grad_scale: 16.0
2024-02-23 08:29:59,955 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 08:30:09,079 INFO [train.py:534] (3/4) Epoch 390, validation: discriminator_loss=2.552, discriminator_real_loss=1.306, discriminator_fake_loss=1.247, generator_loss=30.48, generator_mel_loss=20.58, generator_kl_loss=2.092, generator_dur_loss=1.486, generator_adv_loss=2.381, generator_feat_match_loss=3.942, over 100.00 samples.
2024-02-23 08:30:09,080 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 08:32:50,889 INFO [train.py:845] (3/4) Start epoch 391
2024-02-23 08:34:51,544 INFO [train.py:471] (3/4) Epoch 391, batch 20, global_batch_idx: 14450, batch size: 49, loss[discriminator_loss=2.348, discriminator_real_loss=1.156, discriminator_fake_loss=1.192, generator_loss=29.1, generator_mel_loss=19.08, generator_kl_loss=1.946, generator_dur_loss=1.506, generator_adv_loss=2.344, generator_feat_match_loss=4.23, over 49.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.183, discriminator_fake_loss=1.193, generator_loss=29.57, generator_mel_loss=19.58, generator_kl_loss=1.937, generator_dur_loss=1.496, generator_adv_loss=2.384, generator_feat_match_loss=4.167, over 1377.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 08:36:20,008 INFO [train.py:845] (3/4) Start epoch 392
2024-02-23 08:39:25,241 INFO [train.py:471] (3/4) Epoch 392, batch 33, global_batch_idx: 14500, batch size: 101, loss[discriminator_loss=2.402, discriminator_real_loss=1.225, discriminator_fake_loss=1.179, generator_loss=30.12, generator_mel_loss=19.62, generator_kl_loss=1.981, generator_dur_loss=1.48, generator_adv_loss=2.41, generator_feat_match_loss=4.625, over 101.00 samples.], tot_loss[discriminator_loss=2.418, discriminator_real_loss=1.232, discriminator_fake_loss=1.186, generator_loss=30.31, generator_mel_loss=19.88, generator_kl_loss=1.948, generator_dur_loss=1.492, generator_adv_loss=2.481, generator_feat_match_loss=4.515, over 2702.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 08:39:46,826 INFO [train.py:845] (3/4) Start epoch 393
2024-02-23 08:43:08,513 INFO [train.py:845] (3/4) Start epoch 394
2024-02-23 08:44:14,905 INFO [train.py:471] (3/4) Epoch 394, batch 9, global_batch_idx: 14550, batch size: 95, loss[discriminator_loss=2.457, discriminator_real_loss=1.301, discriminator_fake_loss=1.156, generator_loss=29.55, generator_mel_loss=19.52, generator_kl_loss=1.931, generator_dur_loss=1.473, generator_adv_loss=2.443, generator_feat_match_loss=4.184, over 95.00 samples.], tot_loss[discriminator_loss=2.339, discriminator_real_loss=1.163, discriminator_fake_loss=1.176, generator_loss=30.04, generator_mel_loss=19.62, generator_kl_loss=1.935, generator_dur_loss=1.489, generator_adv_loss=2.459, generator_feat_match_loss=4.544, over 772.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 08:46:35,849 INFO [train.py:845] (3/4) Start epoch 395
2024-02-23 08:48:49,588 INFO [train.py:471] (3/4) Epoch 395, batch 22, global_batch_idx: 14600, batch size: 61, loss[discriminator_loss=2.586, discriminator_real_loss=1.596, discriminator_fake_loss=0.9893, generator_loss=29.76, generator_mel_loss=19.55, generator_kl_loss=1.94, generator_dur_loss=1.499, generator_adv_loss=2.506, generator_feat_match_loss=4.27, over 61.00 samples.], tot_loss[discriminator_loss=2.392, discriminator_real_loss=1.212, discriminator_fake_loss=1.179, generator_loss=29.92, generator_mel_loss=19.6, generator_kl_loss=1.929, generator_dur_loss=1.497, generator_adv_loss=2.497, generator_feat_match_loss=4.405, over 1596.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 08:48:49,589 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 08:48:57,671 INFO [train.py:534] (3/4) Epoch 395, validation: discriminator_loss=2.34, discriminator_real_loss=1.187, discriminator_fake_loss=1.153, generator_loss=31.09, generator_mel_loss=20.58, generator_kl_loss=2.074, generator_dur_loss=1.496, generator_adv_loss=2.426, generator_feat_match_loss=4.516, over 100.00 samples.
2024-02-23 08:48:57,671 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 08:50:15,450 INFO [train.py:845] (3/4) Start epoch 396
2024-02-23 08:53:35,213 INFO [train.py:471] (3/4) Epoch 396, batch 35, global_batch_idx: 14650, batch size: 79, loss[discriminator_loss=2.584, discriminator_real_loss=1.23, discriminator_fake_loss=1.354, generator_loss=28.74, generator_mel_loss=19.48, generator_kl_loss=1.936, generator_dur_loss=1.478, generator_adv_loss=2.5, generator_feat_match_loss=3.348, over 79.00 samples.], tot_loss[discriminator_loss=2.421, discriminator_real_loss=1.226, discriminator_fake_loss=1.195, generator_loss=29.92, generator_mel_loss=19.73, generator_kl_loss=1.944, generator_dur_loss=1.496, generator_adv_loss=2.445, generator_feat_match_loss=4.298, over 2609.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 08:53:42,540 INFO [train.py:845] (3/4) Start epoch 397
2024-02-23 08:57:14,876 INFO [train.py:845] (3/4) Start epoch 398
2024-02-23 08:58:29,148 INFO [train.py:471] (3/4) Epoch 398, batch 11, global_batch_idx: 14700, batch size: 126, loss[discriminator_loss=2.232, discriminator_real_loss=1.199, discriminator_fake_loss=1.033, generator_loss=31.26, generator_mel_loss=20.14, generator_kl_loss=1.982, generator_dur_loss=1.448, generator_adv_loss=2.551, generator_feat_match_loss=5.141, over 126.00 samples.], tot_loss[discriminator_loss=2.391, discriminator_real_loss=1.217, discriminator_fake_loss=1.175, generator_loss=30.21, generator_mel_loss=19.91, generator_kl_loss=1.955, generator_dur_loss=1.487, generator_adv_loss=2.442, generator_feat_match_loss=4.417, over 922.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:00:44,224 INFO [train.py:845] (3/4) Start epoch 399
2024-02-23 09:03:07,705 INFO [train.py:471] (3/4) Epoch 399, batch 24, global_batch_idx: 14750, batch size: 52, loss[discriminator_loss=2.428, discriminator_real_loss=1.229, discriminator_fake_loss=1.198, generator_loss=29.38, generator_mel_loss=19.25, generator_kl_loss=1.92, generator_dur_loss=1.511, generator_adv_loss=2.475, generator_feat_match_loss=4.219, over 52.00 samples.], tot_loss[discriminator_loss=2.369, discriminator_real_loss=1.195, discriminator_fake_loss=1.175, generator_loss=30, generator_mel_loss=19.63, generator_kl_loss=1.927, generator_dur_loss=1.492, generator_adv_loss=2.468, generator_feat_match_loss=4.481, over 1754.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:04:16,810 INFO [train.py:845] (3/4) Start epoch 400
2024-02-23 09:07:45,285 INFO [train.py:845] (3/4) Start epoch 401
2024-02-23 09:07:57,553 INFO [train.py:471] (3/4) Epoch 401, batch 0, global_batch_idx: 14800, batch size: 52, loss[discriminator_loss=2.496, discriminator_real_loss=1.079, discriminator_fake_loss=1.418, generator_loss=30.01, generator_mel_loss=19.56, generator_kl_loss=2.076, generator_dur_loss=1.486, generator_adv_loss=2.639, generator_feat_match_loss=4.246, over 52.00 samples.], tot_loss[discriminator_loss=2.496, discriminator_real_loss=1.079, discriminator_fake_loss=1.418, generator_loss=30.01, generator_mel_loss=19.56, generator_kl_loss=2.076, generator_dur_loss=1.486, generator_adv_loss=2.639, generator_feat_match_loss=4.246, over 52.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 09:07:57,554 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 09:08:06,498 INFO [train.py:534] (3/4) Epoch 401, validation: discriminator_loss=2.494, discriminator_real_loss=1.313, discriminator_fake_loss=1.181, generator_loss=30.59, generator_mel_loss=20.49, generator_kl_loss=2.015, generator_dur_loss=1.485, generator_adv_loss=2.458, generator_feat_match_loss=4.144, over 100.00 samples.
2024-02-23 09:08:06,499 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 09:11:16,549 INFO [train.py:845] (3/4) Start epoch 402
2024-02-23 09:12:42,690 INFO [train.py:471] (3/4) Epoch 402, batch 13, global_batch_idx: 14850, batch size: 69, loss[discriminator_loss=2.449, discriminator_real_loss=1.191, discriminator_fake_loss=1.258, generator_loss=29.78, generator_mel_loss=19.65, generator_kl_loss=1.963, generator_dur_loss=1.499, generator_adv_loss=2.355, generator_feat_match_loss=4.312, over 69.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.208, discriminator_fake_loss=1.221, generator_loss=29.92, generator_mel_loss=19.87, generator_kl_loss=1.956, generator_dur_loss=1.496, generator_adv_loss=2.38, generator_feat_match_loss=4.222, over 980.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 09:14:44,605 INFO [train.py:845] (3/4) Start epoch 403
2024-02-23 09:17:10,108 INFO [train.py:471] (3/4) Epoch 403, batch 26, global_batch_idx: 14900, batch size: 79, loss[discriminator_loss=2.48, discriminator_real_loss=1.278, discriminator_fake_loss=1.201, generator_loss=29.24, generator_mel_loss=19.78, generator_kl_loss=1.847, generator_dur_loss=1.517, generator_adv_loss=2.289, generator_feat_match_loss=3.799, over 79.00 samples.], tot_loss[discriminator_loss=2.495, discriminator_real_loss=1.263, discriminator_fake_loss=1.231, generator_loss=29.97, generator_mel_loss=20.1, generator_kl_loss=1.922, generator_dur_loss=1.491, generator_adv_loss=2.372, generator_feat_match_loss=4.088, over 2097.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:18:08,314 INFO [train.py:845] (3/4) Start epoch 404
2024-02-23 09:21:32,291 INFO [train.py:845] (3/4) Start epoch 405
2024-02-23 09:22:00,028 INFO [train.py:471] (3/4) Epoch 405, batch 2, global_batch_idx: 14950, batch size: 95, loss[discriminator_loss=2.309, discriminator_real_loss=1.141, discriminator_fake_loss=1.169, generator_loss=30.26, generator_mel_loss=19.77, generator_kl_loss=1.944, generator_dur_loss=1.445, generator_adv_loss=2.383, generator_feat_match_loss=4.719, over 95.00 samples.], tot_loss[discriminator_loss=2.3, discriminator_real_loss=1.129, discriminator_fake_loss=1.172, generator_loss=30.2, generator_mel_loss=19.63, generator_kl_loss=1.947, generator_dur_loss=1.486, generator_adv_loss=2.432, generator_feat_match_loss=4.705, over 201.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:25:01,907 INFO [train.py:845] (3/4) Start epoch 406
2024-02-23 09:26:35,559 INFO [train.py:471] (3/4) Epoch 406, batch 15, global_batch_idx: 15000, batch size: 55, loss[discriminator_loss=2.414, discriminator_real_loss=1.241, discriminator_fake_loss=1.173, generator_loss=29.18, generator_mel_loss=19.72, generator_kl_loss=1.895, generator_dur_loss=1.508, generator_adv_loss=2.352, generator_feat_match_loss=3.709, over 55.00 samples.], tot_loss[discriminator_loss=2.499, discriminator_real_loss=1.267, discriminator_fake_loss=1.231, generator_loss=29.17, generator_mel_loss=19.73, generator_kl_loss=1.929, generator_dur_loss=1.499, generator_adv_loss=2.319, generator_feat_match_loss=3.696, over 942.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:26:35,561 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 09:26:44,525 INFO [train.py:534] (3/4) Epoch 406, validation: discriminator_loss=2.564, discriminator_real_loss=1.197, discriminator_fake_loss=1.367, generator_loss=29.36, generator_mel_loss=20.37, generator_kl_loss=2.038, generator_dur_loss=1.485, generator_adv_loss=2.035, generator_feat_match_loss=3.432, over 100.00 samples.
2024-02-23 09:26:44,526 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 09:28:45,043 INFO [train.py:845] (3/4) Start epoch 407
2024-02-23 09:31:30,541 INFO [train.py:471] (3/4) Epoch 407, batch 28, global_batch_idx: 15050, batch size: 81, loss[discriminator_loss=2.32, discriminator_real_loss=1.107, discriminator_fake_loss=1.213, generator_loss=29.47, generator_mel_loss=19.32, generator_kl_loss=1.913, generator_dur_loss=1.479, generator_adv_loss=2.402, generator_feat_match_loss=4.352, over 81.00 samples.], tot_loss[discriminator_loss=2.389, discriminator_real_loss=1.189, discriminator_fake_loss=1.201, generator_loss=29.79, generator_mel_loss=19.68, generator_kl_loss=1.944, generator_dur_loss=1.49, generator_adv_loss=2.394, generator_feat_match_loss=4.283, over 2229.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:32:14,943 INFO [train.py:845] (3/4) Start epoch 408
2024-02-23 09:35:39,925 INFO [train.py:845] (3/4) Start epoch 409
2024-02-23 09:36:13,492 INFO [train.py:471] (3/4) Epoch 409, batch 4, global_batch_idx: 15100, batch size: 58, loss[discriminator_loss=2.383, discriminator_real_loss=1.183, discriminator_fake_loss=1.199, generator_loss=30.32, generator_mel_loss=19.71, generator_kl_loss=1.885, generator_dur_loss=1.513, generator_adv_loss=2.393, generator_feat_match_loss=4.82, over 58.00 samples.], tot_loss[discriminator_loss=2.354, discriminator_real_loss=1.19, discriminator_fake_loss=1.164, generator_loss=30.2, generator_mel_loss=19.62, generator_kl_loss=1.944, generator_dur_loss=1.489, generator_adv_loss=2.461, generator_feat_match_loss=4.689, over 341.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:39:08,523 INFO [train.py:845] (3/4) Start epoch 410
2024-02-23 09:40:55,908 INFO [train.py:471] (3/4) Epoch 410, batch 17, global_batch_idx: 15150, batch size: 53, loss[discriminator_loss=2.32, discriminator_real_loss=1.153, discriminator_fake_loss=1.168, generator_loss=29.98, generator_mel_loss=19.67, generator_kl_loss=1.934, generator_dur_loss=1.505, generator_adv_loss=2.469, generator_feat_match_loss=4.402, over 53.00 samples.], tot_loss[discriminator_loss=2.35, discriminator_real_loss=1.173, discriminator_fake_loss=1.176, generator_loss=29.99, generator_mel_loss=19.6, generator_kl_loss=1.91, generator_dur_loss=1.496, generator_adv_loss=2.443, generator_feat_match_loss=4.547, over 1280.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 09:42:32,958 INFO [train.py:845] (3/4) Start epoch 411
2024-02-23 09:45:31,402 INFO [train.py:471] (3/4) Epoch 411, batch 30, global_batch_idx: 15200, batch size: 49, loss[discriminator_loss=2.379, discriminator_real_loss=1.161, discriminator_fake_loss=1.217, generator_loss=29.18, generator_mel_loss=19.23, generator_kl_loss=1.959, generator_dur_loss=1.532, generator_adv_loss=2.332, generator_feat_match_loss=4.125, over 49.00 samples.], tot_loss[discriminator_loss=2.43, discriminator_real_loss=1.238, discriminator_fake_loss=1.192, generator_loss=30.32, generator_mel_loss=19.79, generator_kl_loss=1.921, generator_dur_loss=1.496, generator_adv_loss=2.498, generator_feat_match_loss=4.612, over 2381.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 09:45:31,403 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 09:45:39,929 INFO [train.py:534] (3/4) Epoch 411, validation: discriminator_loss=2.628, discriminator_real_loss=1.115, discriminator_fake_loss=1.513, generator_loss=29.01, generator_mel_loss=20.05, generator_kl_loss=2.042, generator_dur_loss=1.485, generator_adv_loss=1.87, generator_feat_match_loss=3.568, over 100.00 samples.
2024-02-23 09:45:39,930 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 09:46:09,421 INFO [train.py:845] (3/4) Start epoch 412
2024-02-23 09:49:38,570 INFO [train.py:845] (3/4) Start epoch 413
2024-02-23 09:50:23,258 INFO [train.py:471] (3/4) Epoch 413, batch 6, global_batch_idx: 15250, batch size: 76, loss[discriminator_loss=2.682, discriminator_real_loss=0.9692, discriminator_fake_loss=1.713, generator_loss=29.2, generator_mel_loss=19.26, generator_kl_loss=1.922, generator_dur_loss=1.497, generator_adv_loss=2.48, generator_feat_match_loss=4.039, over 76.00 samples.], tot_loss[discriminator_loss=2.338, discriminator_real_loss=1.144, discriminator_fake_loss=1.194, generator_loss=30.47, generator_mel_loss=19.64, generator_kl_loss=1.933, generator_dur_loss=1.491, generator_adv_loss=2.636, generator_feat_match_loss=4.772, over 558.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 09:53:07,500 INFO [train.py:845] (3/4) Start epoch 414
2024-02-23 09:54:56,393 INFO [train.py:471] (3/4) Epoch 414, batch 19, global_batch_idx: 15300, batch size: 52, loss[discriminator_loss=2.529, discriminator_real_loss=1.359, discriminator_fake_loss=1.17, generator_loss=29.6, generator_mel_loss=19.25, generator_kl_loss=1.853, generator_dur_loss=1.495, generator_adv_loss=2.555, generator_feat_match_loss=4.445, over 52.00 samples.], tot_loss[discriminator_loss=2.345, discriminator_real_loss=1.183, discriminator_fake_loss=1.162, generator_loss=30.1, generator_mel_loss=19.6, generator_kl_loss=1.947, generator_dur_loss=1.488, generator_adv_loss=2.484, generator_feat_match_loss=4.578, over 1530.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 09:56:32,822 INFO [train.py:845] (3/4) Start epoch 415
2024-02-23 09:59:37,499 INFO [train.py:471] (3/4) Epoch 415, batch 32, global_batch_idx: 15350, batch size: 81, loss[discriminator_loss=2.598, discriminator_real_loss=1.512, discriminator_fake_loss=1.086, generator_loss=30.04, generator_mel_loss=19.47, generator_kl_loss=1.961, generator_dur_loss=1.488, generator_adv_loss=2.525, generator_feat_match_loss=4.598, over 81.00 samples.], tot_loss[discriminator_loss=2.41, discriminator_real_loss=1.22, discriminator_fake_loss=1.19, generator_loss=29.83, generator_mel_loss=19.68, generator_kl_loss=1.934, generator_dur_loss=1.488, generator_adv_loss=2.434, generator_feat_match_loss=4.296, over 2347.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 10:00:03,419 INFO [train.py:845] (3/4) Start epoch 416
2024-02-23 10:03:28,416 INFO [train.py:845] (3/4) Start epoch 417
2024-02-23 10:04:23,132 INFO [train.py:471] (3/4) Epoch 417, batch 8, global_batch_idx: 15400, batch size: 59, loss[discriminator_loss=2.486, discriminator_real_loss=1.199, discriminator_fake_loss=1.287, generator_loss=30.11, generator_mel_loss=20.2, generator_kl_loss=1.923, generator_dur_loss=1.497, generator_adv_loss=2.35, generator_feat_match_loss=4.145, over 59.00 samples.], tot_loss[discriminator_loss=2.432, discriminator_real_loss=1.205, discriminator_fake_loss=1.227, generator_loss=29.87, generator_mel_loss=19.88, generator_kl_loss=1.889, generator_dur_loss=1.499, generator_adv_loss=2.377, generator_feat_match_loss=4.23, over 598.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:04:23,134 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 10:04:32,372 INFO [train.py:534] (3/4) Epoch 417, validation: discriminator_loss=2.49, discriminator_real_loss=1.219, discriminator_fake_loss=1.271, generator_loss=29.83, generator_mel_loss=20.32, generator_kl_loss=1.961, generator_dur_loss=1.487, generator_adv_loss=2.117, generator_feat_match_loss=3.94, over 100.00 samples.
2024-02-23 10:04:32,373 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 10:07:04,896 INFO [train.py:845] (3/4) Start epoch 418
2024-02-23 10:09:19,810 INFO [train.py:471] (3/4) Epoch 418, batch 21, global_batch_idx: 15450, batch size: 76, loss[discriminator_loss=2.305, discriminator_real_loss=1.192, discriminator_fake_loss=1.112, generator_loss=30.52, generator_mel_loss=19.86, generator_kl_loss=1.924, generator_dur_loss=1.502, generator_adv_loss=2.582, generator_feat_match_loss=4.652, over 76.00 samples.], tot_loss[discriminator_loss=2.385, discriminator_real_loss=1.212, discriminator_fake_loss=1.174, generator_loss=30.07, generator_mel_loss=19.73, generator_kl_loss=1.905, generator_dur_loss=1.495, generator_adv_loss=2.46, generator_feat_match_loss=4.475, over 1392.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:10:35,064 INFO [train.py:845] (3/4) Start epoch 419
2024-02-23 10:13:50,175 INFO [train.py:471] (3/4) Epoch 419, batch 34, global_batch_idx: 15500, batch size: 69, loss[discriminator_loss=2.414, discriminator_real_loss=1.146, discriminator_fake_loss=1.268, generator_loss=30.01, generator_mel_loss=19.62, generator_kl_loss=1.927, generator_dur_loss=1.504, generator_adv_loss=2.492, generator_feat_match_loss=4.473, over 69.00 samples.], tot_loss[discriminator_loss=2.373, discriminator_real_loss=1.197, discriminator_fake_loss=1.176, generator_loss=30.26, generator_mel_loss=19.79, generator_kl_loss=1.921, generator_dur_loss=1.491, generator_adv_loss=2.469, generator_feat_match_loss=4.588, over 2724.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:14:03,807 INFO [train.py:845] (3/4) Start epoch 420
2024-02-23 10:17:36,558 INFO [train.py:845] (3/4) Start epoch 421
2024-02-23 10:18:52,276 INFO [train.py:471] (3/4) Epoch 421, batch 10, global_batch_idx: 15550, batch size: 153, loss[discriminator_loss=2.328, discriminator_real_loss=1.161, discriminator_fake_loss=1.167, generator_loss=29.97, generator_mel_loss=19.59, generator_kl_loss=1.888, generator_dur_loss=1.441, generator_adv_loss=2.408, generator_feat_match_loss=4.645, over 153.00 samples.], tot_loss[discriminator_loss=2.44, discriminator_real_loss=1.26, discriminator_fake_loss=1.18, generator_loss=30.18, generator_mel_loss=19.73, generator_kl_loss=1.927, generator_dur_loss=1.486, generator_adv_loss=2.498, generator_feat_match_loss=4.545, over 804.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:21:09,841 INFO [train.py:845] (3/4) Start epoch 422
2024-02-23 10:23:25,008 INFO [train.py:471] (3/4) Epoch 422, batch 23, global_batch_idx: 15600, batch size: 153, loss[discriminator_loss=2.367, discriminator_real_loss=1.143, discriminator_fake_loss=1.225, generator_loss=30.05, generator_mel_loss=19.6, generator_kl_loss=1.921, generator_dur_loss=1.441, generator_adv_loss=2.49, generator_feat_match_loss=4.605, over 153.00 samples.], tot_loss[discriminator_loss=2.373, discriminator_real_loss=1.187, discriminator_fake_loss=1.186, generator_loss=30.45, generator_mel_loss=19.69, generator_kl_loss=1.92, generator_dur_loss=1.487, generator_adv_loss=2.551, generator_feat_match_loss=4.8, over 1794.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 10:23:25,009 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 10:23:33,451 INFO [train.py:534] (3/4) Epoch 422, validation: discriminator_loss=2.432, discriminator_real_loss=1.182, discriminator_fake_loss=1.25, generator_loss=30.68, generator_mel_loss=20.25, generator_kl_loss=1.997, generator_dur_loss=1.483, generator_adv_loss=2.371, generator_feat_match_loss=4.581, over 100.00 samples.
2024-02-23 10:23:33,452 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 10:24:42,627 INFO [train.py:845] (3/4) Start epoch 423
2024-02-23 10:28:14,001 INFO [train.py:471] (3/4) Epoch 423, batch 36, global_batch_idx: 15650, batch size: 53, loss[discriminator_loss=2.34, discriminator_real_loss=1.179, discriminator_fake_loss=1.161, generator_loss=29.44, generator_mel_loss=19.3, generator_kl_loss=1.843, generator_dur_loss=1.502, generator_adv_loss=2.459, generator_feat_match_loss=4.336, over 53.00 samples.], tot_loss[discriminator_loss=2.359, discriminator_real_loss=1.185, discriminator_fake_loss=1.174, generator_loss=29.91, generator_mel_loss=19.51, generator_kl_loss=1.926, generator_dur_loss=1.494, generator_adv_loss=2.462, generator_feat_match_loss=4.512, over 2596.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 10:28:14,458 INFO [train.py:845] (3/4) Start epoch 424
2024-02-23 10:31:29,233 INFO [train.py:845] (3/4) Start epoch 425
2024-02-23 10:32:42,770 INFO [train.py:471] (3/4) Epoch 425, batch 12, global_batch_idx: 15700, batch size: 55, loss[discriminator_loss=2.367, discriminator_real_loss=1.277, discriminator_fake_loss=1.09, generator_loss=30.82, generator_mel_loss=20.29, generator_kl_loss=2.012, generator_dur_loss=1.523, generator_adv_loss=2.504, generator_feat_match_loss=4.488, over 55.00 samples.], tot_loss[discriminator_loss=2.359, discriminator_real_loss=1.198, discriminator_fake_loss=1.161, generator_loss=30.28, generator_mel_loss=19.8, generator_kl_loss=1.96, generator_dur_loss=1.492, generator_adv_loss=2.473, generator_feat_match_loss=4.555, over 1005.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 10:34:55,687 INFO [train.py:845] (3/4) Start epoch 426
2024-02-23 10:37:16,071 INFO [train.py:471] (3/4) Epoch 426, batch 25, global_batch_idx: 15750, batch size: 81, loss[discriminator_loss=2.523, discriminator_real_loss=1.403, discriminator_fake_loss=1.12, generator_loss=30.44, generator_mel_loss=19.67, generator_kl_loss=1.991, generator_dur_loss=1.496, generator_adv_loss=2.559, generator_feat_match_loss=4.727, over 81.00 samples.], tot_loss[discriminator_loss=2.346, discriminator_real_loss=1.195, discriminator_fake_loss=1.151, generator_loss=30.03, generator_mel_loss=19.51, generator_kl_loss=1.94, generator_dur_loss=1.494, generator_adv_loss=2.497, generator_feat_match_loss=4.597, over 1756.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 32.0
2024-02-23 10:38:24,167 INFO [train.py:845] (3/4) Start epoch 427
2024-02-23 10:41:53,442 INFO [train.py:845] (3/4) Start epoch 428
2024-02-23 10:42:13,058 INFO [train.py:471] (3/4) Epoch 428, batch 1, global_batch_idx: 15800, batch size: 90, loss[discriminator_loss=2.574, discriminator_real_loss=1.259, discriminator_fake_loss=1.316, generator_loss=28.9, generator_mel_loss=19.57, generator_kl_loss=1.761, generator_dur_loss=1.475, generator_adv_loss=2.432, generator_feat_match_loss=3.664, over 90.00 samples.], tot_loss[discriminator_loss=2.519, discriminator_real_loss=1.195, discriminator_fake_loss=1.324, generator_loss=29.04, generator_mel_loss=19.45, generator_kl_loss=1.837, generator_dur_loss=1.478, generator_adv_loss=2.389, generator_feat_match_loss=3.885, over 154.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:42:13,059 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 10:42:21,790 INFO [train.py:534] (3/4) Epoch 428, validation: discriminator_loss=2.654, discriminator_real_loss=1.347, discriminator_fake_loss=1.306, generator_loss=29.43, generator_mel_loss=20.31, generator_kl_loss=2.034, generator_dur_loss=1.486, generator_adv_loss=2.073, generator_feat_match_loss=3.529, over 100.00 samples.
2024-02-23 10:42:21,790 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 10:45:30,150 INFO [train.py:845] (3/4) Start epoch 429
2024-02-23 10:46:56,512 INFO [train.py:471] (3/4) Epoch 429, batch 14, global_batch_idx: 15850, batch size: 60, loss[discriminator_loss=2.68, discriminator_real_loss=1.682, discriminator_fake_loss=0.9976, generator_loss=30.31, generator_mel_loss=19.92, generator_kl_loss=1.968, generator_dur_loss=1.482, generator_adv_loss=2.594, generator_feat_match_loss=4.352, over 60.00 samples.], tot_loss[discriminator_loss=2.431, discriminator_real_loss=1.27, discriminator_fake_loss=1.161, generator_loss=30.89, generator_mel_loss=19.84, generator_kl_loss=1.927, generator_dur_loss=1.495, generator_adv_loss=2.688, generator_feat_match_loss=4.94, over 987.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:48:59,616 INFO [train.py:845] (3/4) Start epoch 430
2024-02-23 10:51:45,040 INFO [train.py:471] (3/4) Epoch 430, batch 27, global_batch_idx: 15900, batch size: 85, loss[discriminator_loss=2.65, discriminator_real_loss=1.478, discriminator_fake_loss=1.173, generator_loss=29.59, generator_mel_loss=19.54, generator_kl_loss=2.001, generator_dur_loss=1.49, generator_adv_loss=2.377, generator_feat_match_loss=4.176, over 85.00 samples.], tot_loss[discriminator_loss=2.408, discriminator_real_loss=1.209, discriminator_fake_loss=1.198, generator_loss=29.73, generator_mel_loss=19.69, generator_kl_loss=1.948, generator_dur_loss=1.485, generator_adv_loss=2.39, generator_feat_match_loss=4.219, over 2121.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:52:31,980 INFO [train.py:845] (3/4) Start epoch 431
2024-02-23 10:56:02,707 INFO [train.py:845] (3/4) Start epoch 432
2024-02-23 10:56:36,517 INFO [train.py:471] (3/4) Epoch 432, batch 3, global_batch_idx: 15950, batch size: 85, loss[discriminator_loss=2.465, discriminator_real_loss=1.18, discriminator_fake_loss=1.285, generator_loss=30.02, generator_mel_loss=19.86, generator_kl_loss=1.867, generator_dur_loss=1.491, generator_adv_loss=2.367, generator_feat_match_loss=4.434, over 85.00 samples.], tot_loss[discriminator_loss=2.438, discriminator_real_loss=1.204, discriminator_fake_loss=1.234, generator_loss=30.15, generator_mel_loss=19.98, generator_kl_loss=1.9, generator_dur_loss=1.477, generator_adv_loss=2.384, generator_feat_match_loss=4.412, over 377.00 samples.], cur_lr_g: 1.90e-04, cur_lr_d: 1.90e-04, grad_scale: 16.0
2024-02-23 10:59:37,796 INFO [train.py:845] (3/4) Start epoch 433
2024-02-23 11:01:10,704 INFO [train.py:471] (3/4) Epoch 433, batch 16, global_batch_idx: 16000, batch size: 55, loss[discriminator_loss=2.422, discriminator_real_loss=1.243, discriminator_fake_loss=1.18, generator_loss=29.53, generator_mel_loss=19.53, generator_kl_loss=1.841, generator_dur_loss=1.514, generator_adv_loss=2.453, generator_feat_match_loss=4.195, over 55.00 samples.], tot_loss[discriminator_loss=2.425, discriminator_real_loss=1.261, discriminator_fake_loss=1.164, generator_loss=30.54, generator_mel_loss=19.86, generator_kl_loss=1.925, generator_dur_loss=1.491, generator_adv_loss=2.616, generator_feat_match_loss=4.647, over 1165.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0
2024-02-23 11:01:10,706 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 11:01:19,431 INFO [train.py:534] (3/4) Epoch 433, validation: discriminator_loss=2.396, discriminator_real_loss=1.174, discriminator_fake_loss=1.222, generator_loss=30.56, generator_mel_loss=20.32, generator_kl_loss=2.014, generator_dur_loss=1.485, generator_adv_loss=2.381, generator_feat_match_loss=4.364, over 100.00 samples.
2024-02-23 11:01:19,432 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 11:03:13,897 INFO [train.py:845] (3/4) Start epoch 434
2024-02-23 11:06:03,054 INFO [train.py:471] (3/4) Epoch 434, batch 29, global_batch_idx: 16050, batch size: 76, loss[discriminator_loss=2.641, discriminator_real_loss=1.287, discriminator_fake_loss=1.353, generator_loss=29, generator_mel_loss=19.76, generator_kl_loss=1.922, generator_dur_loss=1.49, generator_adv_loss=2.279, generator_feat_match_loss=3.555, over 76.00 samples.], tot_loss[discriminator_loss=2.422, discriminator_real_loss=1.207, discriminator_fake_loss=1.215, generator_loss=30.04, generator_mel_loss=19.96, generator_kl_loss=1.929, generator_dur_loss=1.492, generator_adv_loss=2.387, generator_feat_match_loss=4.274, over 2075.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:06:41,643 INFO [train.py:845] (3/4) Start epoch 435
2024-02-23 11:10:10,175 INFO [train.py:845] (3/4) Start epoch 436
2024-02-23 11:10:48,857 INFO [train.py:471] (3/4) Epoch 436, batch 5, global_batch_idx: 16100, batch size: 65, loss[discriminator_loss=2.445, discriminator_real_loss=1.232, discriminator_fake_loss=1.213, generator_loss=29.46, generator_mel_loss=19.34, generator_kl_loss=1.834, generator_dur_loss=1.519, generator_adv_loss=2.393, generator_feat_match_loss=4.375, over 65.00 samples.], tot_loss[discriminator_loss=2.482, discriminator_real_loss=1.239, discriminator_fake_loss=1.243, generator_loss=29.84, generator_mel_loss=19.74, generator_kl_loss=1.937, generator_dur_loss=1.5, generator_adv_loss=2.46, generator_feat_match_loss=4.204, over 426.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:13:39,900 INFO [train.py:845] (3/4) Start epoch 437
2024-02-23 11:15:26,162 INFO [train.py:471] (3/4) Epoch 437, batch 18, global_batch_idx: 16150, batch size: 53, loss[discriminator_loss=2.461, discriminator_real_loss=1.34, discriminator_fake_loss=1.12, generator_loss=29.94, generator_mel_loss=19.41, generator_kl_loss=2.027, generator_dur_loss=1.522, generator_adv_loss=2.391, generator_feat_match_loss=4.59, over 53.00 samples.], tot_loss[discriminator_loss=2.345, discriminator_real_loss=1.195, discriminator_fake_loss=1.15, generator_loss=30.29, generator_mel_loss=19.67, generator_kl_loss=1.918, generator_dur_loss=1.49, generator_adv_loss=2.49, generator_feat_match_loss=4.714, over 1376.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:17:01,942 INFO [train.py:845] (3/4) Start epoch 438
2024-02-23 11:20:08,828 INFO [train.py:471] (3/4) Epoch 438, batch 31, global_batch_idx: 16200, batch size: 64, loss[discriminator_loss=2.441, discriminator_real_loss=1.242, discriminator_fake_loss=1.198, generator_loss=30.29, generator_mel_loss=19.97, generator_kl_loss=1.959, generator_dur_loss=1.494, generator_adv_loss=2.453, generator_feat_match_loss=4.414, over 64.00 samples.], tot_loss[discriminator_loss=2.431, discriminator_real_loss=1.213, discriminator_fake_loss=1.218, generator_loss=29.9, generator_mel_loss=19.61, generator_kl_loss=1.936, generator_dur_loss=1.494, generator_adv_loss=2.441, generator_feat_match_loss=4.418, over 2135.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:20:08,830 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 11:20:17,501 INFO [train.py:534] (3/4) Epoch 438, validation: discriminator_loss=2.267, discriminator_real_loss=1.092, discriminator_fake_loss=1.175, generator_loss=31.61, generator_mel_loss=20.93, generator_kl_loss=2.061, generator_dur_loss=1.484, generator_adv_loss=2.504, generator_feat_match_loss=4.626, over 100.00 samples.
2024-02-23 11:20:17,502 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 11:20:41,010 INFO [train.py:845] (3/4) Start epoch 439
2024-02-23 11:24:10,810 INFO [train.py:845] (3/4) Start epoch 440
2024-02-23 11:24:59,288 INFO [train.py:471] (3/4) Epoch 440, batch 7, global_batch_idx: 16250, batch size: 90, loss[discriminator_loss=2.316, discriminator_real_loss=1.153, discriminator_fake_loss=1.162, generator_loss=30.87, generator_mel_loss=20.14, generator_kl_loss=1.835, generator_dur_loss=1.476, generator_adv_loss=2.592, generator_feat_match_loss=4.824, over 90.00 samples.], tot_loss[discriminator_loss=2.439, discriminator_real_loss=1.226, discriminator_fake_loss=1.212, generator_loss=30.28, generator_mel_loss=20.11, generator_kl_loss=1.936, generator_dur_loss=1.48, generator_adv_loss=2.387, generator_feat_match_loss=4.373, over 606.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 11:27:39,615 INFO [train.py:845] (3/4) Start epoch 441
2024-02-23 11:29:42,380 INFO [train.py:471] (3/4) Epoch 441, batch 20, global_batch_idx: 16300, batch size: 90, loss[discriminator_loss=2.416, discriminator_real_loss=1.172, discriminator_fake_loss=1.244, generator_loss=29.88, generator_mel_loss=19.68, generator_kl_loss=1.887, generator_dur_loss=1.474, generator_adv_loss=2.334, generator_feat_match_loss=4.512, over 90.00 samples.], tot_loss[discriminator_loss=2.344, discriminator_real_loss=1.185, discriminator_fake_loss=1.159, generator_loss=29.98, generator_mel_loss=19.62, generator_kl_loss=1.928, generator_dur_loss=1.489, generator_adv_loss=2.443, generator_feat_match_loss=4.502, over 1527.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 11:31:02,702 INFO [train.py:845] (3/4) Start epoch 442
2024-02-23 11:34:17,102 INFO [train.py:471] (3/4) Epoch 442, batch 33, global_batch_idx: 16350, batch size: 56, loss[discriminator_loss=2.355, discriminator_real_loss=1.199, discriminator_fake_loss=1.155, generator_loss=30.47, generator_mel_loss=19.8, generator_kl_loss=2.015, generator_dur_loss=1.497, generator_adv_loss=2.48, generator_feat_match_loss=4.676, over 56.00 samples.], tot_loss[discriminator_loss=2.365, discriminator_real_loss=1.191, discriminator_fake_loss=1.174, generator_loss=30.3, generator_mel_loss=19.77, generator_kl_loss=1.943, generator_dur_loss=1.486, generator_adv_loss=2.462, generator_feat_match_loss=4.637, over 2780.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 11:34:38,715 INFO [train.py:845] (3/4) Start epoch 443
2024-02-23 11:38:07,847 INFO [train.py:845] (3/4) Start epoch 444
2024-02-23 11:39:13,893 INFO [train.py:471] (3/4) Epoch 444, batch 9, global_batch_idx: 16400, batch size: 71, loss[discriminator_loss=2.402, discriminator_real_loss=1.187, discriminator_fake_loss=1.215, generator_loss=29.86, generator_mel_loss=19.64, generator_kl_loss=1.902, generator_dur_loss=1.489, generator_adv_loss=2.379, generator_feat_match_loss=4.453, over 71.00 samples.], tot_loss[discriminator_loss=2.372, discriminator_real_loss=1.203, discriminator_fake_loss=1.169, generator_loss=30.14, generator_mel_loss=19.6, generator_kl_loss=1.947, generator_dur_loss=1.498, generator_adv_loss=2.454, generator_feat_match_loss=4.643, over 681.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:39:13,895 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 11:39:22,473 INFO [train.py:534] (3/4) Epoch 444, validation: discriminator_loss=2.351, discriminator_real_loss=1.143, discriminator_fake_loss=1.208, generator_loss=30.85, generator_mel_loss=20.45, generator_kl_loss=2.066, generator_dur_loss=1.479, generator_adv_loss=2.282, generator_feat_match_loss=4.579, over 100.00 samples.
2024-02-23 11:39:22,474 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 11:41:47,484 INFO [train.py:845] (3/4) Start epoch 445
2024-02-23 11:43:57,593 INFO [train.py:471] (3/4) Epoch 445, batch 22, global_batch_idx: 16450, batch size: 110, loss[discriminator_loss=2.332, discriminator_real_loss=1.16, discriminator_fake_loss=1.173, generator_loss=30.85, generator_mel_loss=20.24, generator_kl_loss=1.932, generator_dur_loss=1.502, generator_adv_loss=2.484, generator_feat_match_loss=4.691, over 110.00 samples.], tot_loss[discriminator_loss=2.368, discriminator_real_loss=1.202, discriminator_fake_loss=1.167, generator_loss=30.18, generator_mel_loss=19.67, generator_kl_loss=1.939, generator_dur_loss=1.496, generator_adv_loss=2.465, generator_feat_match_loss=4.612, over 1703.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:45:12,860 INFO [train.py:845] (3/4) Start epoch 446
2024-02-23 11:48:37,589 INFO [train.py:471] (3/4) Epoch 446, batch 35, global_batch_idx: 16500, batch size: 95, loss[discriminator_loss=2.344, discriminator_real_loss=0.9702, discriminator_fake_loss=1.373, generator_loss=30.18, generator_mel_loss=19.63, generator_kl_loss=1.885, generator_dur_loss=1.475, generator_adv_loss=2.594, generator_feat_match_loss=4.602, over 95.00 samples.], tot_loss[discriminator_loss=2.424, discriminator_real_loss=1.223, discriminator_fake_loss=1.201, generator_loss=30.27, generator_mel_loss=19.76, generator_kl_loss=1.941, generator_dur_loss=1.488, generator_adv_loss=2.477, generator_feat_match_loss=4.601, over 2706.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:48:43,140 INFO [train.py:845] (3/4) Start epoch 447
2024-02-23 11:52:04,797 INFO [train.py:845] (3/4) Start epoch 448
2024-02-23 11:53:19,808 INFO [train.py:471] (3/4) Epoch 448, batch 11, global_batch_idx: 16550, batch size: 90, loss[discriminator_loss=2.43, discriminator_real_loss=1.229, discriminator_fake_loss=1.202, generator_loss=28.92, generator_mel_loss=19.09, generator_kl_loss=1.952, generator_dur_loss=1.456, generator_adv_loss=2.383, generator_feat_match_loss=4.039, over 90.00 samples.], tot_loss[discriminator_loss=2.464, discriminator_real_loss=1.257, discriminator_fake_loss=1.206, generator_loss=29.32, generator_mel_loss=19.38, generator_kl_loss=1.896, generator_dur_loss=1.486, generator_adv_loss=2.389, generator_feat_match_loss=4.171, over 1011.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:55:35,594 INFO [train.py:845] (3/4) Start epoch 449
2024-02-23 11:58:01,135 INFO [train.py:471] (3/4) Epoch 449, batch 24, global_batch_idx: 16600, batch size: 56, loss[discriminator_loss=2.145, discriminator_real_loss=1.059, discriminator_fake_loss=1.087, generator_loss=30.06, generator_mel_loss=18.93, generator_kl_loss=1.86, generator_dur_loss=1.485, generator_adv_loss=2.621, generator_feat_match_loss=5.16, over 56.00 samples.], tot_loss[discriminator_loss=2.401, discriminator_real_loss=1.208, discriminator_fake_loss=1.194, generator_loss=29.73, generator_mel_loss=19.55, generator_kl_loss=1.928, generator_dur_loss=1.492, generator_adv_loss=2.396, generator_feat_match_loss=4.364, over 1647.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 11:58:01,137 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 11:58:09,255 INFO [train.py:534] (3/4) Epoch 449, validation: discriminator_loss=2.159, discriminator_real_loss=0.8856, discriminator_fake_loss=1.273, generator_loss=32.42, generator_mel_loss=20.98, generator_kl_loss=2.136, generator_dur_loss=1.489, generator_adv_loss=2.452, generator_feat_match_loss=5.358, over 100.00 samples.
2024-02-23 11:58:09,256 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 11:59:14,124 INFO [train.py:845] (3/4) Start epoch 450
2024-02-23 12:02:45,434 INFO [train.py:845] (3/4) Start epoch 451
2024-02-23 12:02:59,473 INFO [train.py:471] (3/4) Epoch 451, batch 0, global_batch_idx: 16650, batch size: 52, loss[discriminator_loss=2.273, discriminator_real_loss=1.109, discriminator_fake_loss=1.164, generator_loss=29.6, generator_mel_loss=19.14, generator_kl_loss=1.89, generator_dur_loss=1.506, generator_adv_loss=2.43, generator_feat_match_loss=4.637, over 52.00 samples.], tot_loss[discriminator_loss=2.273, discriminator_real_loss=1.109, discriminator_fake_loss=1.164, generator_loss=29.6, generator_mel_loss=19.14, generator_kl_loss=1.89, generator_dur_loss=1.506, generator_adv_loss=2.43, generator_feat_match_loss=4.637, over 52.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:06:14,614 INFO [train.py:845] (3/4) Start epoch 452
2024-02-23 12:07:34,495 INFO [train.py:471] (3/4) Epoch 452, batch 13, global_batch_idx: 16700, batch size: 73, loss[discriminator_loss=2.666, discriminator_real_loss=1.567, discriminator_fake_loss=1.099, generator_loss=29.86, generator_mel_loss=19.73, generator_kl_loss=1.923, generator_dur_loss=1.451, generator_adv_loss=2.316, generator_feat_match_loss=4.445, over 73.00 samples.], tot_loss[discriminator_loss=2.311, discriminator_real_loss=1.173, discriminator_fake_loss=1.138, generator_loss=30.76, generator_mel_loss=19.67, generator_kl_loss=1.941, generator_dur_loss=1.478, generator_adv_loss=2.616, generator_feat_match_loss=5.056, over 1039.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:09:36,398 INFO [train.py:845] (3/4) Start epoch 453
2024-02-23 12:12:04,321 INFO [train.py:471] (3/4) Epoch 453, batch 26, global_batch_idx: 16750, batch size: 81, loss[discriminator_loss=2.373, discriminator_real_loss=1.168, discriminator_fake_loss=1.205, generator_loss=30.13, generator_mel_loss=19.98, generator_kl_loss=1.939, generator_dur_loss=1.494, generator_adv_loss=2.377, generator_feat_match_loss=4.348, over 81.00 samples.], tot_loss[discriminator_loss=2.385, discriminator_real_loss=1.199, discriminator_fake_loss=1.186, generator_loss=29.56, generator_mel_loss=19.43, generator_kl_loss=1.934, generator_dur_loss=1.491, generator_adv_loss=2.403, generator_feat_match_loss=4.302, over 1915.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:12:59,835 INFO [train.py:845] (3/4) Start epoch 454
2024-02-23 12:16:26,971 INFO [train.py:845] (3/4) Start epoch 455
2024-02-23 12:16:48,775 INFO [train.py:471] (3/4) Epoch 455, batch 2, global_batch_idx: 16800, batch size: 85, loss[discriminator_loss=2.406, discriminator_real_loss=1.184, discriminator_fake_loss=1.223, generator_loss=29.27, generator_mel_loss=19.31, generator_kl_loss=1.938, generator_dur_loss=1.496, generator_adv_loss=2.375, generator_feat_match_loss=4.152, over 85.00 samples.], tot_loss[discriminator_loss=2.416, discriminator_real_loss=1.174, discriminator_fake_loss=1.242, generator_loss=29.13, generator_mel_loss=19.29, generator_kl_loss=1.889, generator_dur_loss=1.507, generator_adv_loss=2.362, generator_feat_match_loss=4.087, over 201.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 32.0
2024-02-23 12:16:48,775 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 12:16:58,009 INFO [train.py:534] (3/4) Epoch 455, validation: discriminator_loss=2.396, discriminator_real_loss=1.176, discriminator_fake_loss=1.221, generator_loss=30.19, generator_mel_loss=20.07, generator_kl_loss=2.061, generator_dur_loss=1.482, generator_adv_loss=2.325, generator_feat_match_loss=4.252, over 100.00 samples.
2024-02-23 12:16:58,009 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 12:20:03,315 INFO [train.py:845] (3/4) Start epoch 456
2024-02-23 12:21:42,188 INFO [train.py:471] (3/4) Epoch 456, batch 15, global_batch_idx: 16850, batch size: 71, loss[discriminator_loss=2.527, discriminator_real_loss=1.19, discriminator_fake_loss=1.338, generator_loss=29.93, generator_mel_loss=19.86, generator_kl_loss=1.93, generator_dur_loss=1.476, generator_adv_loss=2.588, generator_feat_match_loss=4.078, over 71.00 samples.], tot_loss[discriminator_loss=2.532, discriminator_real_loss=1.223, discriminator_fake_loss=1.309, generator_loss=30.15, generator_mel_loss=19.88, generator_kl_loss=1.918, generator_dur_loss=1.482, generator_adv_loss=2.465, generator_feat_match_loss=4.407, over 1211.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:23:36,296 INFO [train.py:845] (3/4) Start epoch 457
2024-02-23 12:26:08,958 INFO [train.py:471] (3/4) Epoch 457, batch 28, global_batch_idx: 16900, batch size: 76, loss[discriminator_loss=2.406, discriminator_real_loss=1.185, discriminator_fake_loss=1.221, generator_loss=29.95, generator_mel_loss=19.79, generator_kl_loss=1.956, generator_dur_loss=1.487, generator_adv_loss=2.42, generator_feat_match_loss=4.301, over 76.00 samples.], tot_loss[discriminator_loss=2.422, discriminator_real_loss=1.212, discriminator_fake_loss=1.21, generator_loss=29.94, generator_mel_loss=19.7, generator_kl_loss=1.923, generator_dur_loss=1.49, generator_adv_loss=2.464, generator_feat_match_loss=4.36, over 1949.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:26:59,590 INFO [train.py:845] (3/4) Start epoch 458
2024-02-23 12:30:23,426 INFO [train.py:845] (3/4) Start epoch 459
2024-02-23 12:31:00,002 INFO [train.py:471] (3/4) Epoch 459, batch 4, global_batch_idx: 16950, batch size: 65, loss[discriminator_loss=2.41, discriminator_real_loss=1.224, discriminator_fake_loss=1.188, generator_loss=30.73, generator_mel_loss=20.16, generator_kl_loss=2.029, generator_dur_loss=1.492, generator_adv_loss=2.459, generator_feat_match_loss=4.586, over 65.00 samples.], tot_loss[discriminator_loss=2.451, discriminator_real_loss=1.232, discriminator_fake_loss=1.219, generator_loss=29.87, generator_mel_loss=19.8, generator_kl_loss=2.001, generator_dur_loss=1.501, generator_adv_loss=2.403, generator_feat_match_loss=4.157, over 324.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:33:53,352 INFO [train.py:845] (3/4) Start epoch 460
2024-02-23 12:35:39,832 INFO [train.py:471] (3/4) Epoch 460, batch 17, global_batch_idx: 17000, batch size: 153, loss[discriminator_loss=2.545, discriminator_real_loss=1.296, discriminator_fake_loss=1.249, generator_loss=30.38, generator_mel_loss=20.1, generator_kl_loss=1.952, generator_dur_loss=1.439, generator_adv_loss=2.297, generator_feat_match_loss=4.594, over 153.00 samples.], tot_loss[discriminator_loss=2.553, discriminator_real_loss=1.28, discriminator_fake_loss=1.273, generator_loss=29.93, generator_mel_loss=19.94, generator_kl_loss=1.944, generator_dur_loss=1.493, generator_adv_loss=2.378, generator_feat_match_loss=4.168, over 1309.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:35:39,834 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 12:35:47,795 INFO [train.py:534] (3/4) Epoch 460, validation: discriminator_loss=2.368, discriminator_real_loss=1.151, discriminator_fake_loss=1.218, generator_loss=30.94, generator_mel_loss=20.58, generator_kl_loss=2.024, generator_dur_loss=1.486, generator_adv_loss=2.274, generator_feat_match_loss=4.576, over 100.00 samples.
2024-02-23 12:35:47,796 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 12:37:37,307 INFO [train.py:845] (3/4) Start epoch 461
2024-02-23 12:40:33,625 INFO [train.py:471] (3/4) Epoch 461, batch 30, global_batch_idx: 17050, batch size: 82, loss[discriminator_loss=2.312, discriminator_real_loss=1.105, discriminator_fake_loss=1.206, generator_loss=29.61, generator_mel_loss=19.32, generator_kl_loss=1.995, generator_dur_loss=1.478, generator_adv_loss=2.363, generator_feat_match_loss=4.453, over 82.00 samples.], tot_loss[discriminator_loss=2.479, discriminator_real_loss=1.242, discriminator_fake_loss=1.237, generator_loss=29.56, generator_mel_loss=19.63, generator_kl_loss=1.927, generator_dur_loss=1.488, generator_adv_loss=2.36, generator_feat_match_loss=4.153, over 2369.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:41:07,109 INFO [train.py:845] (3/4) Start epoch 462
2024-02-23 12:44:33,282 INFO [train.py:845] (3/4) Start epoch 463
2024-02-23 12:45:22,807 INFO [train.py:471] (3/4) Epoch 463, batch 6, global_batch_idx: 17100, batch size: 71, loss[discriminator_loss=2.352, discriminator_real_loss=1.25, discriminator_fake_loss=1.101, generator_loss=30.18, generator_mel_loss=19.85, generator_kl_loss=1.909, generator_dur_loss=1.478, generator_adv_loss=2.451, generator_feat_match_loss=4.492, over 71.00 samples.], tot_loss[discriminator_loss=2.502, discriminator_real_loss=1.21, discriminator_fake_loss=1.291, generator_loss=29.44, generator_mel_loss=19.63, generator_kl_loss=1.913, generator_dur_loss=1.481, generator_adv_loss=2.31, generator_feat_match_loss=4.107, over 563.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 12:48:02,687 INFO [train.py:845] (3/4) Start epoch 464
2024-02-23 12:49:55,169 INFO [train.py:471] (3/4) Epoch 464, batch 19, global_batch_idx: 17150, batch size: 71, loss[discriminator_loss=2.336, discriminator_real_loss=1.232, discriminator_fake_loss=1.104, generator_loss=29.79, generator_mel_loss=19.57, generator_kl_loss=1.912, generator_dur_loss=1.486, generator_adv_loss=2.412, generator_feat_match_loss=4.414, over 71.00 samples.], tot_loss[discriminator_loss=2.502, discriminator_real_loss=1.273, discriminator_fake_loss=1.229, generator_loss=29.43, generator_mel_loss=19.67, generator_kl_loss=1.918, generator_dur_loss=1.494, generator_adv_loss=2.327, generator_feat_match_loss=4.024, over 1331.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 12:51:27,769 INFO [train.py:845] (3/4) Start epoch 465
2024-02-23 12:54:34,612 INFO [train.py:471] (3/4) Epoch 465, batch 32, global_batch_idx: 17200, batch size: 126, loss[discriminator_loss=2.438, discriminator_real_loss=1.195, discriminator_fake_loss=1.242, generator_loss=29.86, generator_mel_loss=19.97, generator_kl_loss=1.938, generator_dur_loss=1.47, generator_adv_loss=2.27, generator_feat_match_loss=4.211, over 126.00 samples.], tot_loss[discriminator_loss=2.527, discriminator_real_loss=1.263, discriminator_fake_loss=1.263, generator_loss=29.87, generator_mel_loss=19.98, generator_kl_loss=1.917, generator_dur_loss=1.485, generator_adv_loss=2.321, generator_feat_match_loss=4.166, over 2540.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 16.0
2024-02-23 12:54:34,614 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 12:54:43,387 INFO [train.py:534] (3/4) Epoch 465, validation: discriminator_loss=2.732, discriminator_real_loss=1.186, discriminator_fake_loss=1.546, generator_loss=28.91, generator_mel_loss=20.55, generator_kl_loss=2.018, generator_dur_loss=1.491, generator_adv_loss=1.628, generator_feat_match_loss=3.224, over 100.00 samples.
2024-02-23 12:54:43,389 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 12:55:05,346 INFO [train.py:845] (3/4) Start epoch 466
2024-02-23 12:58:32,026 INFO [train.py:845] (3/4) Start epoch 467
2024-02-23 12:59:26,065 INFO [train.py:471] (3/4) Epoch 467, batch 8, global_batch_idx: 17250, batch size: 60, loss[discriminator_loss=2.5, discriminator_real_loss=1.27, discriminator_fake_loss=1.229, generator_loss=29.61, generator_mel_loss=20.15, generator_kl_loss=1.939, generator_dur_loss=1.5, generator_adv_loss=2.471, generator_feat_match_loss=3.553, over 60.00 samples.], tot_loss[discriminator_loss=2.489, discriminator_real_loss=1.216, discriminator_fake_loss=1.273, generator_loss=30.51, generator_mel_loss=20.43, generator_kl_loss=1.927, generator_dur_loss=1.489, generator_adv_loss=2.384, generator_feat_match_loss=4.273, over 620.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:01:58,331 INFO [train.py:845] (3/4) Start epoch 468
2024-02-23 13:04:11,066 INFO [train.py:471] (3/4) Epoch 468, batch 21, global_batch_idx: 17300, batch size: 69, loss[discriminator_loss=2.301, discriminator_real_loss=1.224, discriminator_fake_loss=1.078, generator_loss=30.96, generator_mel_loss=20.17, generator_kl_loss=1.913, generator_dur_loss=1.486, generator_adv_loss=2.549, generator_feat_match_loss=4.844, over 69.00 samples.], tot_loss[discriminator_loss=2.451, discriminator_real_loss=1.252, discriminator_fake_loss=1.199, generator_loss=29.97, generator_mel_loss=19.77, generator_kl_loss=1.924, generator_dur_loss=1.481, generator_adv_loss=2.432, generator_feat_match_loss=4.361, over 1790.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:05:31,892 INFO [train.py:845] (3/4) Start epoch 469
2024-02-23 13:08:54,675 INFO [train.py:471] (3/4) Epoch 469, batch 34, global_batch_idx: 17350, batch size: 65, loss[discriminator_loss=2.357, discriminator_real_loss=1.332, discriminator_fake_loss=1.025, generator_loss=30.75, generator_mel_loss=19.79, generator_kl_loss=1.93, generator_dur_loss=1.49, generator_adv_loss=2.572, generator_feat_match_loss=4.969, over 65.00 samples.], tot_loss[discriminator_loss=2.413, discriminator_real_loss=1.214, discriminator_fake_loss=1.199, generator_loss=30.06, generator_mel_loss=19.77, generator_kl_loss=1.912, generator_dur_loss=1.489, generator_adv_loss=2.422, generator_feat_match_loss=4.475, over 2519.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:09:03,303 INFO [train.py:845] (3/4) Start epoch 470
2024-02-23 13:12:32,257 INFO [train.py:845] (3/4) Start epoch 471
2024-02-23 13:13:46,840 INFO [train.py:471] (3/4) Epoch 471, batch 10, global_batch_idx: 17400, batch size: 59, loss[discriminator_loss=2.348, discriminator_real_loss=1.166, discriminator_fake_loss=1.183, generator_loss=30.36, generator_mel_loss=19.77, generator_kl_loss=1.982, generator_dur_loss=1.505, generator_adv_loss=2.406, generator_feat_match_loss=4.688, over 59.00 samples.], tot_loss[discriminator_loss=2.401, discriminator_real_loss=1.188, discriminator_fake_loss=1.213, generator_loss=29.85, generator_mel_loss=19.62, generator_kl_loss=1.931, generator_dur_loss=1.485, generator_adv_loss=2.408, generator_feat_match_loss=4.411, over 853.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:13:46,841 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 13:13:55,061 INFO [train.py:534] (3/4) Epoch 471, validation: discriminator_loss=2.327, discriminator_real_loss=1.113, discriminator_fake_loss=1.214, generator_loss=30.3, generator_mel_loss=20.08, generator_kl_loss=1.996, generator_dur_loss=1.484, generator_adv_loss=2.293, generator_feat_match_loss=4.451, over 100.00 samples.
2024-02-23 13:13:55,062 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 13:16:14,559 INFO [train.py:845] (3/4) Start epoch 472
2024-02-23 13:18:27,709 INFO [train.py:471] (3/4) Epoch 472, batch 23, global_batch_idx: 17450, batch size: 50, loss[discriminator_loss=2.344, discriminator_real_loss=1.132, discriminator_fake_loss=1.212, generator_loss=30, generator_mel_loss=19.54, generator_kl_loss=1.984, generator_dur_loss=1.476, generator_adv_loss=2.381, generator_feat_match_loss=4.617, over 50.00 samples.], tot_loss[discriminator_loss=2.438, discriminator_real_loss=1.236, discriminator_fake_loss=1.202, generator_loss=30.07, generator_mel_loss=19.82, generator_kl_loss=1.945, generator_dur_loss=1.489, generator_adv_loss=2.405, generator_feat_match_loss=4.405, over 1602.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:19:48,245 INFO [train.py:845] (3/4) Start epoch 473
2024-02-23 13:23:16,858 INFO [train.py:471] (3/4) Epoch 473, batch 36, global_batch_idx: 17500, batch size: 69, loss[discriminator_loss=2.352, discriminator_real_loss=1.213, discriminator_fake_loss=1.139, generator_loss=29.49, generator_mel_loss=19.35, generator_kl_loss=1.838, generator_dur_loss=1.475, generator_adv_loss=2.375, generator_feat_match_loss=4.453, over 69.00 samples.], tot_loss[discriminator_loss=2.494, discriminator_real_loss=1.25, discriminator_fake_loss=1.244, generator_loss=29.68, generator_mel_loss=19.64, generator_kl_loss=1.931, generator_dur_loss=1.487, generator_adv_loss=2.373, generator_feat_match_loss=4.249, over 2555.00 samples.], cur_lr_g: 1.89e-04, cur_lr_d: 1.89e-04, grad_scale: 8.0
2024-02-23 13:23:17,316 INFO [train.py:845] (3/4) Start epoch 474
2024-02-23 13:26:39,986 INFO [train.py:845] (3/4) Start epoch 475
2024-02-23 13:28:02,468 INFO [train.py:471] (3/4) Epoch 475, batch 12, global_batch_idx: 17550, batch size: 67, loss[discriminator_loss=2.467, discriminator_real_loss=1.279, discriminator_fake_loss=1.188, generator_loss=29.78, generator_mel_loss=19.56, generator_kl_loss=1.909, generator_dur_loss=1.512, generator_adv_loss=2.486, generator_feat_match_loss=4.316, over 67.00 samples.], tot_loss[discriminator_loss=2.421, discriminator_real_loss=1.196, discriminator_fake_loss=1.225, generator_loss=29.85, generator_mel_loss=19.61, generator_kl_loss=1.923, generator_dur_loss=1.49, generator_adv_loss=2.356, generator_feat_match_loss=4.476, over 975.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 13:30:09,660 INFO [train.py:845] (3/4) Start epoch 476
2024-02-23 13:32:46,146 INFO [train.py:471] (3/4) Epoch 476, batch 25, global_batch_idx: 17600, batch size: 153, loss[discriminator_loss=2.25, discriminator_real_loss=1.132, discriminator_fake_loss=1.117, generator_loss=31.55, generator_mel_loss=20.24, generator_kl_loss=2.017, generator_dur_loss=1.483, generator_adv_loss=2.539, generator_feat_match_loss=5.277, over 153.00 samples.], tot_loss[discriminator_loss=2.401, discriminator_real_loss=1.207, discriminator_fake_loss=1.194, generator_loss=30.24, generator_mel_loss=19.79, generator_kl_loss=1.944, generator_dur_loss=1.486, generator_adv_loss=2.434, generator_feat_match_loss=4.59, over 1927.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 13:32:46,148 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 13:32:54,160 INFO [train.py:534] (3/4) Epoch 476, validation: discriminator_loss=2.252, discriminator_real_loss=1.093, discriminator_fake_loss=1.159, generator_loss=31.39, generator_mel_loss=20.54, generator_kl_loss=2, generator_dur_loss=1.48, generator_adv_loss=2.423, generator_feat_match_loss=4.952, over 100.00 samples.
2024-02-23 13:32:54,161 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 13:33:47,088 INFO [train.py:845] (3/4) Start epoch 477
2024-02-23 13:37:18,433 INFO [train.py:845] (3/4) Start epoch 478
2024-02-23 13:37:38,779 INFO [train.py:471] (3/4) Epoch 478, batch 1, global_batch_idx: 17650, batch size: 61, loss[discriminator_loss=2.34, discriminator_real_loss=1.263, discriminator_fake_loss=1.076, generator_loss=30.17, generator_mel_loss=19.51, generator_kl_loss=1.914, generator_dur_loss=1.485, generator_adv_loss=2.553, generator_feat_match_loss=4.715, over 61.00 samples.], tot_loss[discriminator_loss=2.371, discriminator_real_loss=1.238, discriminator_fake_loss=1.132, generator_loss=30.27, generator_mel_loss=19.69, generator_kl_loss=1.939, generator_dur_loss=1.478, generator_adv_loss=2.53, generator_feat_match_loss=4.639, over 187.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 13:40:52,063 INFO [train.py:845] (3/4) Start epoch 479
2024-02-23 13:42:22,385 INFO [train.py:471] (3/4) Epoch 479, batch 14, global_batch_idx: 17700, batch size: 50, loss[discriminator_loss=2.652, discriminator_real_loss=1.381, discriminator_fake_loss=1.271, generator_loss=29.39, generator_mel_loss=19.04, generator_kl_loss=1.922, generator_dur_loss=1.51, generator_adv_loss=2.523, generator_feat_match_loss=4.395, over 50.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.215, discriminator_fake_loss=1.16, generator_loss=30.37, generator_mel_loss=19.62, generator_kl_loss=1.924, generator_dur_loss=1.488, generator_adv_loss=2.577, generator_feat_match_loss=4.755, over 1028.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 13:44:22,197 INFO [train.py:845] (3/4) Start epoch 480
2024-02-23 13:46:59,047 INFO [train.py:471] (3/4) Epoch 480, batch 27, global_batch_idx: 17750, batch size: 53, loss[discriminator_loss=2.25, discriminator_real_loss=1.129, discriminator_fake_loss=1.122, generator_loss=30.7, generator_mel_loss=19.8, generator_kl_loss=1.939, generator_dur_loss=1.497, generator_adv_loss=2.479, generator_feat_match_loss=4.984, over 53.00 samples.], tot_loss[discriminator_loss=2.336, discriminator_real_loss=1.176, discriminator_fake_loss=1.159, generator_loss=30.27, generator_mel_loss=19.51, generator_kl_loss=1.93, generator_dur_loss=1.487, generator_adv_loss=2.507, generator_feat_match_loss=4.834, over 2023.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 13:47:47,822 INFO [train.py:845] (3/4) Start epoch 481
2024-02-23 13:51:14,703 INFO [train.py:845] (3/4) Start epoch 482
2024-02-23 13:51:42,143 INFO [train.py:471] (3/4) Epoch 482, batch 3, global_batch_idx: 17800, batch size: 76, loss[discriminator_loss=2.445, discriminator_real_loss=1.237, discriminator_fake_loss=1.208, generator_loss=29.73, generator_mel_loss=19.92, generator_kl_loss=1.954, generator_dur_loss=1.498, generator_adv_loss=2.617, generator_feat_match_loss=3.742, over 76.00 samples.], tot_loss[discriminator_loss=2.516, discriminator_real_loss=1.197, discriminator_fake_loss=1.319, generator_loss=30.17, generator_mel_loss=20.09, generator_kl_loss=1.916, generator_dur_loss=1.502, generator_adv_loss=2.453, generator_feat_match_loss=4.206, over 250.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 13:51:42,144 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 13:51:50,593 INFO [train.py:534] (3/4) Epoch 482, validation: discriminator_loss=2.612, discriminator_real_loss=1.364, discriminator_fake_loss=1.248, generator_loss=29.95, generator_mel_loss=20.63, generator_kl_loss=2.013, generator_dur_loss=1.479, generator_adv_loss=2.3, generator_feat_match_loss=3.527, over 100.00 samples.
2024-02-23 13:51:50,594 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 13:54:52,949 INFO [train.py:845] (3/4) Start epoch 483
2024-02-23 13:56:33,893 INFO [train.py:471] (3/4) Epoch 483, batch 16, global_batch_idx: 17850, batch size: 55, loss[discriminator_loss=2.311, discriminator_real_loss=1.269, discriminator_fake_loss=1.042, generator_loss=30.08, generator_mel_loss=19.39, generator_kl_loss=1.943, generator_dur_loss=1.523, generator_adv_loss=2.424, generator_feat_match_loss=4.801, over 55.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.278, discriminator_fake_loss=1.234, generator_loss=29.87, generator_mel_loss=20.02, generator_kl_loss=1.956, generator_dur_loss=1.493, generator_adv_loss=2.324, generator_feat_match_loss=4.08, over 1153.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 13:58:24,883 INFO [train.py:845] (3/4) Start epoch 484
2024-02-23 14:01:02,909 INFO [train.py:471] (3/4) Epoch 484, batch 29, global_batch_idx: 17900, batch size: 50, loss[discriminator_loss=2.418, discriminator_real_loss=1.316, discriminator_fake_loss=1.103, generator_loss=30.26, generator_mel_loss=20.14, generator_kl_loss=1.842, generator_dur_loss=1.505, generator_adv_loss=2.322, generator_feat_match_loss=4.445, over 50.00 samples.], tot_loss[discriminator_loss=2.511, discriminator_real_loss=1.264, discriminator_fake_loss=1.246, generator_loss=29.69, generator_mel_loss=19.69, generator_kl_loss=1.928, generator_dur_loss=1.483, generator_adv_loss=2.355, generator_feat_match_loss=4.234, over 2255.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 14:01:47,100 INFO [train.py:845] (3/4) Start epoch 485
2024-02-23 14:05:22,231 INFO [train.py:845] (3/4) Start epoch 486
2024-02-23 14:05:58,012 INFO [train.py:471] (3/4) Epoch 486, batch 5, global_batch_idx: 17950, batch size: 95, loss[discriminator_loss=2.531, discriminator_real_loss=1.504, discriminator_fake_loss=1.026, generator_loss=30.38, generator_mel_loss=20.11, generator_kl_loss=2.006, generator_dur_loss=1.464, generator_adv_loss=2.486, generator_feat_match_loss=4.316, over 95.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.249, discriminator_fake_loss=1.257, generator_loss=30.11, generator_mel_loss=19.82, generator_kl_loss=1.945, generator_dur_loss=1.477, generator_adv_loss=2.416, generator_feat_match_loss=4.454, over 429.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 14:08:52,156 INFO [train.py:845] (3/4) Start epoch 487
2024-02-23 14:10:38,865 INFO [train.py:471] (3/4) Epoch 487, batch 18, global_batch_idx: 18000, batch size: 52, loss[discriminator_loss=3.594, discriminator_real_loss=2.268, discriminator_fake_loss=1.325, generator_loss=27.81, generator_mel_loss=19.43, generator_kl_loss=1.959, generator_dur_loss=1.502, generator_adv_loss=2.182, generator_feat_match_loss=2.738, over 52.00 samples.], tot_loss[discriminator_loss=2.655, discriminator_real_loss=1.36, discriminator_fake_loss=1.296, generator_loss=30.16, generator_mel_loss=20, generator_kl_loss=1.93, generator_dur_loss=1.485, generator_adv_loss=2.453, generator_feat_match_loss=4.292, over 1370.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:10:38,866 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 14:10:46,995 INFO [train.py:534] (3/4) Epoch 487, validation: discriminator_loss=3.241, discriminator_real_loss=1.899, discriminator_fake_loss=1.342, generator_loss=29.05, generator_mel_loss=20.51, generator_kl_loss=2.022, generator_dur_loss=1.481, generator_adv_loss=2.05, generator_feat_match_loss=2.987, over 100.00 samples.
2024-02-23 14:10:46,996 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 14:12:23,184 INFO [train.py:845] (3/4) Start epoch 488
2024-02-23 14:15:30,066 INFO [train.py:471] (3/4) Epoch 488, batch 31, global_batch_idx: 18050, batch size: 60, loss[discriminator_loss=2.803, discriminator_real_loss=1.398, discriminator_fake_loss=1.404, generator_loss=28.34, generator_mel_loss=19.86, generator_kl_loss=1.855, generator_dur_loss=1.497, generator_adv_loss=1.983, generator_feat_match_loss=3.146, over 60.00 samples.], tot_loss[discriminator_loss=2.827, discriminator_real_loss=1.471, discriminator_fake_loss=1.356, generator_loss=27.77, generator_mel_loss=19.58, generator_kl_loss=1.911, generator_dur_loss=1.487, generator_adv_loss=1.849, generator_feat_match_loss=2.946, over 2212.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:15:57,415 INFO [train.py:845] (3/4) Start epoch 489
2024-02-23 14:19:24,059 INFO [train.py:845] (3/4) Start epoch 490
2024-02-23 14:20:20,938 INFO [train.py:471] (3/4) Epoch 490, batch 7, global_batch_idx: 18100, batch size: 79, loss[discriminator_loss=3.016, discriminator_real_loss=1.491, discriminator_fake_loss=1.523, generator_loss=27.61, generator_mel_loss=19.91, generator_kl_loss=1.848, generator_dur_loss=1.467, generator_adv_loss=1.756, generator_feat_match_loss=2.627, over 79.00 samples.], tot_loss[discriminator_loss=2.753, discriminator_real_loss=1.373, discriminator_fake_loss=1.381, generator_loss=28.56, generator_mel_loss=19.89, generator_kl_loss=1.945, generator_dur_loss=1.485, generator_adv_loss=1.963, generator_feat_match_loss=3.277, over 570.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:22:51,556 INFO [train.py:845] (3/4) Start epoch 491
2024-02-23 14:24:42,888 INFO [train.py:471] (3/4) Epoch 491, batch 20, global_batch_idx: 18150, batch size: 59, loss[discriminator_loss=2.77, discriminator_real_loss=1.487, discriminator_fake_loss=1.281, generator_loss=28.11, generator_mel_loss=19.54, generator_kl_loss=2.015, generator_dur_loss=1.493, generator_adv_loss=1.925, generator_feat_match_loss=3.137, over 59.00 samples.], tot_loss[discriminator_loss=2.736, discriminator_real_loss=1.368, discriminator_fake_loss=1.368, generator_loss=28.17, generator_mel_loss=19.64, generator_kl_loss=1.925, generator_dur_loss=1.49, generator_adv_loss=1.919, generator_feat_match_loss=3.199, over 1492.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:26:20,061 INFO [train.py:845] (3/4) Start epoch 492
2024-02-23 14:29:32,718 INFO [train.py:471] (3/4) Epoch 492, batch 33, global_batch_idx: 18200, batch size: 59, loss[discriminator_loss=3.043, discriminator_real_loss=1.135, discriminator_fake_loss=1.907, generator_loss=28.33, generator_mel_loss=20.06, generator_kl_loss=1.9, generator_dur_loss=1.485, generator_adv_loss=1.737, generator_feat_match_loss=3.141, over 59.00 samples.], tot_loss[discriminator_loss=2.76, discriminator_real_loss=1.408, discriminator_fake_loss=1.352, generator_loss=28.65, generator_mel_loss=19.92, generator_kl_loss=1.922, generator_dur_loss=1.48, generator_adv_loss=2.006, generator_feat_match_loss=3.321, over 2727.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:29:32,720 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 14:29:41,579 INFO [train.py:534] (3/4) Epoch 492, validation: discriminator_loss=2.879, discriminator_real_loss=1.336, discriminator_fake_loss=1.543, generator_loss=29.03, generator_mel_loss=20.52, generator_kl_loss=2.06, generator_dur_loss=1.482, generator_adv_loss=1.687, generator_feat_match_loss=3.281, over 100.00 samples.
2024-02-23 14:29:41,580 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 14:29:59,455 INFO [train.py:845] (3/4) Start epoch 493
2024-02-23 14:33:30,333 INFO [train.py:845] (3/4) Start epoch 494
2024-02-23 14:34:28,236 INFO [train.py:471] (3/4) Epoch 494, batch 9, global_batch_idx: 18250, batch size: 153, loss[discriminator_loss=2.613, discriminator_real_loss=1.273, discriminator_fake_loss=1.341, generator_loss=29.36, generator_mel_loss=20.17, generator_kl_loss=1.928, generator_dur_loss=1.454, generator_adv_loss=2.225, generator_feat_match_loss=3.58, over 153.00 samples.], tot_loss[discriminator_loss=2.632, discriminator_real_loss=1.35, discriminator_fake_loss=1.283, generator_loss=29.68, generator_mel_loss=20.11, generator_kl_loss=1.952, generator_dur_loss=1.48, generator_adv_loss=2.286, generator_feat_match_loss=3.852, over 808.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:36:56,909 INFO [train.py:845] (3/4) Start epoch 495
2024-02-23 14:39:00,945 INFO [train.py:471] (3/4) Epoch 495, batch 22, global_batch_idx: 18300, batch size: 85, loss[discriminator_loss=2.777, discriminator_real_loss=1.211, discriminator_fake_loss=1.566, generator_loss=28.94, generator_mel_loss=20.31, generator_kl_loss=1.934, generator_dur_loss=1.465, generator_adv_loss=1.998, generator_feat_match_loss=3.232, over 85.00 samples.], tot_loss[discriminator_loss=2.677, discriminator_real_loss=1.347, discriminator_fake_loss=1.329, generator_loss=29.55, generator_mel_loss=20.3, generator_kl_loss=1.917, generator_dur_loss=1.489, generator_adv_loss=2.164, generator_feat_match_loss=3.677, over 1693.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 14:40:23,217 INFO [train.py:845] (3/4) Start epoch 496
2024-02-23 14:43:48,012 INFO [train.py:471] (3/4) Epoch 496, batch 35, global_batch_idx: 18350, batch size: 63, loss[discriminator_loss=2.617, discriminator_real_loss=1.386, discriminator_fake_loss=1.232, generator_loss=30.15, generator_mel_loss=20.18, generator_kl_loss=1.917, generator_dur_loss=1.507, generator_adv_loss=2.328, generator_feat_match_loss=4.215, over 63.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.339, discriminator_fake_loss=1.294, generator_loss=29.27, generator_mel_loss=20.06, generator_kl_loss=1.908, generator_dur_loss=1.489, generator_adv_loss=2.147, generator_feat_match_loss=3.672, over 2389.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 14:43:56,108 INFO [train.py:845] (3/4) Start epoch 497
2024-02-23 14:47:26,615 INFO [train.py:845] (3/4) Start epoch 498
2024-02-23 14:48:38,122 INFO [train.py:471] (3/4) Epoch 498, batch 11, global_batch_idx: 18400, batch size: 54, loss[discriminator_loss=2.617, discriminator_real_loss=1.328, discriminator_fake_loss=1.289, generator_loss=29.93, generator_mel_loss=20.08, generator_kl_loss=1.981, generator_dur_loss=1.504, generator_adv_loss=2.418, generator_feat_match_loss=3.953, over 54.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.285, discriminator_fake_loss=1.292, generator_loss=29.91, generator_mel_loss=20.23, generator_kl_loss=1.907, generator_dur_loss=1.492, generator_adv_loss=2.277, generator_feat_match_loss=4.009, over 799.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 14:48:38,124 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 14:48:45,777 INFO [train.py:534] (3/4) Epoch 498, validation: discriminator_loss=2.567, discriminator_real_loss=1.163, discriminator_fake_loss=1.404, generator_loss=30.24, generator_mel_loss=21.03, generator_kl_loss=2.033, generator_dur_loss=1.483, generator_adv_loss=1.825, generator_feat_match_loss=3.877, over 100.00 samples.
2024-02-23 14:48:45,778 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 14:51:04,717 INFO [train.py:845] (3/4) Start epoch 499
2024-02-23 14:53:23,348 INFO [train.py:471] (3/4) Epoch 499, batch 24, global_batch_idx: 18450, batch size: 69, loss[discriminator_loss=2.568, discriminator_real_loss=1.372, discriminator_fake_loss=1.196, generator_loss=30.76, generator_mel_loss=20.26, generator_kl_loss=1.846, generator_dur_loss=1.479, generator_adv_loss=2.551, generator_feat_match_loss=4.621, over 69.00 samples.], tot_loss[discriminator_loss=2.59, discriminator_real_loss=1.302, discriminator_fake_loss=1.288, generator_loss=29.8, generator_mel_loss=20.27, generator_kl_loss=1.937, generator_dur_loss=1.488, generator_adv_loss=2.21, generator_feat_match_loss=3.895, over 1708.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 14:54:33,409 INFO [train.py:845] (3/4) Start epoch 500
2024-02-23 14:57:59,444 INFO [train.py:845] (3/4) Start epoch 501
2024-02-23 14:58:11,296 INFO [train.py:471] (3/4) Epoch 501, batch 0, global_batch_idx: 18500, batch size: 60, loss[discriminator_loss=2.586, discriminator_real_loss=1.354, discriminator_fake_loss=1.232, generator_loss=29.87, generator_mel_loss=20.47, generator_kl_loss=1.972, generator_dur_loss=1.492, generator_adv_loss=2.191, generator_feat_match_loss=3.746, over 60.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.354, discriminator_fake_loss=1.232, generator_loss=29.87, generator_mel_loss=20.47, generator_kl_loss=1.972, generator_dur_loss=1.492, generator_adv_loss=2.191, generator_feat_match_loss=3.746, over 60.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:01:25,822 INFO [train.py:845] (3/4) Start epoch 502
2024-02-23 15:02:56,365 INFO [train.py:471] (3/4) Epoch 502, batch 13, global_batch_idx: 18550, batch size: 153, loss[discriminator_loss=3.109, discriminator_real_loss=1.379, discriminator_fake_loss=1.729, generator_loss=28.95, generator_mel_loss=20.28, generator_kl_loss=1.871, generator_dur_loss=1.452, generator_adv_loss=1.945, generator_feat_match_loss=3.406, over 153.00 samples.], tot_loss[discriminator_loss=2.73, discriminator_real_loss=1.376, discriminator_fake_loss=1.354, generator_loss=29.68, generator_mel_loss=20.03, generator_kl_loss=1.916, generator_dur_loss=1.482, generator_adv_loss=2.295, generator_feat_match_loss=3.962, over 1049.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:04:57,316 INFO [train.py:845] (3/4) Start epoch 503
2024-02-23 15:07:36,960 INFO [train.py:471] (3/4) Epoch 503, batch 26, global_batch_idx: 18600, batch size: 59, loss[discriminator_loss=2.326, discriminator_real_loss=1.197, discriminator_fake_loss=1.129, generator_loss=30.67, generator_mel_loss=20.36, generator_kl_loss=1.88, generator_dur_loss=1.485, generator_adv_loss=2.326, generator_feat_match_loss=4.609, over 59.00 samples.], tot_loss[discriminator_loss=2.643, discriminator_real_loss=1.334, discriminator_fake_loss=1.308, generator_loss=29.63, generator_mel_loss=20.31, generator_kl_loss=1.922, generator_dur_loss=1.485, generator_adv_loss=2.122, generator_feat_match_loss=3.785, over 1917.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:07:36,962 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 15:07:46,050 INFO [train.py:534] (3/4) Epoch 503, validation: discriminator_loss=2.356, discriminator_real_loss=1.034, discriminator_fake_loss=1.322, generator_loss=31.69, generator_mel_loss=21.55, generator_kl_loss=1.998, generator_dur_loss=1.493, generator_adv_loss=1.977, generator_feat_match_loss=4.664, over 100.00 samples.
2024-02-23 15:07:46,051 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 15:08:34,432 INFO [train.py:845] (3/4) Start epoch 504
2024-02-23 15:12:01,425 INFO [train.py:845] (3/4) Start epoch 505
2024-02-23 15:12:28,064 INFO [train.py:471] (3/4) Epoch 505, batch 2, global_batch_idx: 18650, batch size: 90, loss[discriminator_loss=2.531, discriminator_real_loss=1.296, discriminator_fake_loss=1.234, generator_loss=30.25, generator_mel_loss=20.71, generator_kl_loss=2.021, generator_dur_loss=1.502, generator_adv_loss=2.088, generator_feat_match_loss=3.93, over 90.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.382, discriminator_fake_loss=1.23, generator_loss=29.48, generator_mel_loss=20.22, generator_kl_loss=1.921, generator_dur_loss=1.497, generator_adv_loss=2.137, generator_feat_match_loss=3.704, over 213.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:15:30,897 INFO [train.py:845] (3/4) Start epoch 506
2024-02-23 15:16:59,831 INFO [train.py:471] (3/4) Epoch 506, batch 15, global_batch_idx: 18700, batch size: 110, loss[discriminator_loss=2.699, discriminator_real_loss=1.308, discriminator_fake_loss=1.393, generator_loss=28.4, generator_mel_loss=19.98, generator_kl_loss=1.917, generator_dur_loss=1.502, generator_adv_loss=1.893, generator_feat_match_loss=3.109, over 110.00 samples.], tot_loss[discriminator_loss=2.624, discriminator_real_loss=1.333, discriminator_fake_loss=1.291, generator_loss=29.71, generator_mel_loss=19.97, generator_kl_loss=1.91, generator_dur_loss=1.485, generator_adv_loss=2.294, generator_feat_match_loss=4.058, over 1181.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:18:59,014 INFO [train.py:845] (3/4) Start epoch 507
2024-02-23 15:21:42,965 INFO [train.py:471] (3/4) Epoch 507, batch 28, global_batch_idx: 18750, batch size: 81, loss[discriminator_loss=2.467, discriminator_real_loss=1.405, discriminator_fake_loss=1.062, generator_loss=30.81, generator_mel_loss=20.38, generator_kl_loss=1.887, generator_dur_loss=1.489, generator_adv_loss=2.533, generator_feat_match_loss=4.52, over 81.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.309, discriminator_fake_loss=1.284, generator_loss=29.58, generator_mel_loss=20.22, generator_kl_loss=1.899, generator_dur_loss=1.489, generator_adv_loss=2.176, generator_feat_match_loss=3.794, over 2040.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:22:27,695 INFO [train.py:845] (3/4) Start epoch 508
2024-02-23 15:26:00,219 INFO [train.py:845] (3/4) Start epoch 509
2024-02-23 15:26:32,511 INFO [train.py:471] (3/4) Epoch 509, batch 4, global_batch_idx: 18800, batch size: 59, loss[discriminator_loss=2.578, discriminator_real_loss=1.231, discriminator_fake_loss=1.348, generator_loss=29.42, generator_mel_loss=20.25, generator_kl_loss=1.932, generator_dur_loss=1.48, generator_adv_loss=2.092, generator_feat_match_loss=3.668, over 59.00 samples.], tot_loss[discriminator_loss=2.762, discriminator_real_loss=1.444, discriminator_fake_loss=1.317, generator_loss=29.24, generator_mel_loss=20.18, generator_kl_loss=1.916, generator_dur_loss=1.497, generator_adv_loss=2.043, generator_feat_match_loss=3.598, over 318.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 16.0
2024-02-23 15:26:32,514 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 15:26:41,390 INFO [train.py:534] (3/4) Epoch 509, validation: discriminator_loss=2.552, discriminator_real_loss=1.242, discriminator_fake_loss=1.309, generator_loss=30.36, generator_mel_loss=21.21, generator_kl_loss=2.052, generator_dur_loss=1.486, generator_adv_loss=2.01, generator_feat_match_loss=3.603, over 100.00 samples.
2024-02-23 15:26:41,391 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 15:29:35,994 INFO [train.py:845] (3/4) Start epoch 510
2024-02-23 15:31:18,827 INFO [train.py:471] (3/4) Epoch 510, batch 17, global_batch_idx: 18850, batch size: 126, loss[discriminator_loss=2.678, discriminator_real_loss=1.396, discriminator_fake_loss=1.281, generator_loss=31.04, generator_mel_loss=20.49, generator_kl_loss=1.874, generator_dur_loss=1.49, generator_adv_loss=2.672, generator_feat_match_loss=4.512, over 126.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.317, discriminator_fake_loss=1.296, generator_loss=30.22, generator_mel_loss=20.39, generator_kl_loss=1.925, generator_dur_loss=1.486, generator_adv_loss=2.373, generator_feat_match_loss=4.046, over 1369.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:33:02,437 INFO [train.py:845] (3/4) Start epoch 511
2024-02-23 15:35:59,725 INFO [train.py:471] (3/4) Epoch 511, batch 30, global_batch_idx: 18900, batch size: 53, loss[discriminator_loss=2.723, discriminator_real_loss=1.275, discriminator_fake_loss=1.446, generator_loss=29.39, generator_mel_loss=20.6, generator_kl_loss=1.944, generator_dur_loss=1.508, generator_adv_loss=1.961, generator_feat_match_loss=3.379, over 53.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.332, discriminator_fake_loss=1.308, generator_loss=29.51, generator_mel_loss=20.26, generator_kl_loss=1.918, generator_dur_loss=1.487, generator_adv_loss=2.146, generator_feat_match_loss=3.699, over 2095.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:36:36,309 INFO [train.py:845] (3/4) Start epoch 512
2024-02-23 15:40:03,169 INFO [train.py:845] (3/4) Start epoch 513
2024-02-23 15:40:52,838 INFO [train.py:471] (3/4) Epoch 513, batch 6, global_batch_idx: 18950, batch size: 154, loss[discriminator_loss=2.316, discriminator_real_loss=1, discriminator_fake_loss=1.316, generator_loss=29.78, generator_mel_loss=20.06, generator_kl_loss=1.818, generator_dur_loss=1.463, generator_adv_loss=2.156, generator_feat_match_loss=4.285, over 154.00 samples.], tot_loss[discriminator_loss=2.509, discriminator_real_loss=1.183, discriminator_fake_loss=1.326, generator_loss=29.17, generator_mel_loss=20.08, generator_kl_loss=1.886, generator_dur_loss=1.469, generator_adv_loss=1.955, generator_feat_match_loss=3.782, over 655.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:43:35,703 INFO [train.py:845] (3/4) Start epoch 514
2024-02-23 15:45:35,054 INFO [train.py:471] (3/4) Epoch 514, batch 19, global_batch_idx: 19000, batch size: 58, loss[discriminator_loss=2.91, discriminator_real_loss=1.732, discriminator_fake_loss=1.177, generator_loss=31.01, generator_mel_loss=20.35, generator_kl_loss=1.907, generator_dur_loss=1.489, generator_adv_loss=2.885, generator_feat_match_loss=4.379, over 58.00 samples.], tot_loss[discriminator_loss=2.625, discriminator_real_loss=1.341, discriminator_fake_loss=1.283, generator_loss=29.96, generator_mel_loss=20.34, generator_kl_loss=1.949, generator_dur_loss=1.493, generator_adv_loss=2.268, generator_feat_match_loss=3.907, over 1228.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:45:35,055 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 15:45:44,706 INFO [train.py:534] (3/4) Epoch 514, validation: discriminator_loss=2.6, discriminator_real_loss=1.433, discriminator_fake_loss=1.167, generator_loss=30.88, generator_mel_loss=21.43, generator_kl_loss=1.978, generator_dur_loss=1.481, generator_adv_loss=2.18, generator_feat_match_loss=3.812, over 100.00 samples.
2024-02-23 15:45:44,707 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 15:47:14,820 INFO [train.py:845] (3/4) Start epoch 515
2024-02-23 15:50:19,807 INFO [train.py:471] (3/4) Epoch 515, batch 32, global_batch_idx: 19050, batch size: 52, loss[discriminator_loss=2.52, discriminator_real_loss=1.326, discriminator_fake_loss=1.194, generator_loss=29.36, generator_mel_loss=20.24, generator_kl_loss=1.976, generator_dur_loss=1.487, generator_adv_loss=2.053, generator_feat_match_loss=3.598, over 52.00 samples.], tot_loss[discriminator_loss=2.646, discriminator_real_loss=1.344, discriminator_fake_loss=1.302, generator_loss=29.88, generator_mel_loss=20.29, generator_kl_loss=1.924, generator_dur_loss=1.483, generator_adv_loss=2.227, generator_feat_match_loss=3.955, over 2381.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:50:47,202 INFO [train.py:845] (3/4) Start epoch 516
2024-02-23 15:54:17,545 INFO [train.py:845] (3/4) Start epoch 517
2024-02-23 15:55:11,523 INFO [train.py:471] (3/4) Epoch 517, batch 8, global_batch_idx: 19100, batch size: 59, loss[discriminator_loss=2.223, discriminator_real_loss=1.08, discriminator_fake_loss=1.144, generator_loss=31.58, generator_mel_loss=20.63, generator_kl_loss=2.022, generator_dur_loss=1.486, generator_adv_loss=2.307, generator_feat_match_loss=5.133, over 59.00 samples.], tot_loss[discriminator_loss=2.695, discriminator_real_loss=1.41, discriminator_fake_loss=1.285, generator_loss=30.27, generator_mel_loss=20.36, generator_kl_loss=1.955, generator_dur_loss=1.484, generator_adv_loss=2.364, generator_feat_match_loss=4.114, over 655.00 samples.], cur_lr_g: 1.88e-04, cur_lr_d: 1.88e-04, grad_scale: 8.0
2024-02-23 15:57:50,221 INFO [train.py:845] (3/4) Start epoch 518
2024-02-23 15:59:53,981 INFO [train.py:471] (3/4) Epoch 518, batch 21, global_batch_idx: 19150, batch size: 65, loss[discriminator_loss=2.781, discriminator_real_loss=1.528, discriminator_fake_loss=1.253, generator_loss=29.78, generator_mel_loss=20.65, generator_kl_loss=1.826, generator_dur_loss=1.499, generator_adv_loss=2.266, generator_feat_match_loss=3.535, over 65.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.302, discriminator_fake_loss=1.313, generator_loss=29.68, generator_mel_loss=20.27, generator_kl_loss=1.894, generator_dur_loss=1.485, generator_adv_loss=2.182, generator_feat_match_loss=3.853, over 1455.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:01:24,348 INFO [train.py:845] (3/4) Start epoch 519
2024-02-23 16:04:39,353 INFO [train.py:471] (3/4) Epoch 519, batch 34, global_batch_idx: 19200, batch size: 55, loss[discriminator_loss=2.451, discriminator_real_loss=1.183, discriminator_fake_loss=1.269, generator_loss=30.2, generator_mel_loss=20.3, generator_kl_loss=1.864, generator_dur_loss=1.524, generator_adv_loss=2.393, generator_feat_match_loss=4.113, over 55.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.315, discriminator_fake_loss=1.278, generator_loss=29.92, generator_mel_loss=20.44, generator_kl_loss=1.921, generator_dur_loss=1.486, generator_adv_loss=2.176, generator_feat_match_loss=3.896, over 2437.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0
2024-02-23 16:04:39,354 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 16:04:48,261 INFO [train.py:534] (3/4) Epoch 519, validation: discriminator_loss=2.767, discriminator_real_loss=1.25, discriminator_fake_loss=1.517, generator_loss=29.13, generator_mel_loss=20.57, generator_kl_loss=2.007, generator_dur_loss=1.485, generator_adv_loss=1.685, generator_feat_match_loss=3.382, over 100.00 samples.
2024-02-23 16:04:48,262 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 16:04:56,872 INFO [train.py:845] (3/4) Start epoch 520
2024-02-23 16:08:29,688 INFO [train.py:845] (3/4) Start epoch 521
2024-02-23 16:09:35,824 INFO [train.py:471] (3/4) Epoch 521, batch 10, global_batch_idx: 19250, batch size: 61, loss[discriminator_loss=2.578, discriminator_real_loss=1.225, discriminator_fake_loss=1.354, generator_loss=29.3, generator_mel_loss=19.85, generator_kl_loss=1.986, generator_dur_loss=1.518, generator_adv_loss=2.395, generator_feat_match_loss=3.557, over 61.00 samples.], tot_loss[discriminator_loss=2.629, discriminator_real_loss=1.353, discriminator_fake_loss=1.276, generator_loss=29.91, generator_mel_loss=20.25, generator_kl_loss=1.909, generator_dur_loss=1.488, generator_adv_loss=2.264, generator_feat_match_loss=3.997, over 783.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:11:55,638 INFO [train.py:845] (3/4) Start epoch 522
2024-02-23 16:14:13,424 INFO [train.py:471] (3/4) Epoch 522, batch 23, global_batch_idx: 19300, batch size: 59, loss[discriminator_loss=2.711, discriminator_real_loss=1.36, discriminator_fake_loss=1.351, generator_loss=30.57, generator_mel_loss=20.04, generator_kl_loss=1.882, generator_dur_loss=1.5, generator_adv_loss=2.619, generator_feat_match_loss=4.535, over 59.00 samples.], tot_loss[discriminator_loss=2.617, discriminator_real_loss=1.322, discriminator_fake_loss=1.295, generator_loss=29.94, generator_mel_loss=20.35, generator_kl_loss=1.938, generator_dur_loss=1.483, generator_adv_loss=2.228, generator_feat_match_loss=3.935, over 1841.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:15:18,540 INFO [train.py:845] (3/4) Start epoch 523
2024-02-23 16:18:51,451 INFO [train.py:471] (3/4) Epoch 523, batch 36, global_batch_idx: 19350, batch size: 79, loss[discriminator_loss=2.637, discriminator_real_loss=1.479, discriminator_fake_loss=1.157, generator_loss=30.28, generator_mel_loss=20.11, generator_kl_loss=1.886, generator_dur_loss=1.494, generator_adv_loss=2.613, generator_feat_match_loss=4.168, over 79.00 samples.], tot_loss[discriminator_loss=2.657, discriminator_real_loss=1.365, discriminator_fake_loss=1.293, generator_loss=29.96, generator_mel_loss=20.29, generator_kl_loss=1.909, generator_dur_loss=1.479, generator_adv_loss=2.263, generator_feat_match_loss=4.024, over 2862.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:18:51,953 INFO [train.py:845] (3/4) Start epoch 524
2024-02-23 16:22:16,142 INFO [train.py:845] (3/4) Start epoch 525
2024-02-23 16:23:34,764 INFO [train.py:471] (3/4) Epoch 525, batch 12, global_batch_idx: 19400, batch size: 69, loss[discriminator_loss=2.625, discriminator_real_loss=1.412, discriminator_fake_loss=1.213, generator_loss=29.49, generator_mel_loss=20.44, generator_kl_loss=1.886, generator_dur_loss=1.494, generator_adv_loss=2.02, generator_feat_match_loss=3.652, over 69.00 samples.], tot_loss[discriminator_loss=2.635, discriminator_real_loss=1.35, discriminator_fake_loss=1.285, generator_loss=29.6, generator_mel_loss=20.38, generator_kl_loss=1.899, generator_dur_loss=1.489, generator_adv_loss=2.084, generator_feat_match_loss=3.747, over 848.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:23:34,765 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 16:23:43,239 INFO [train.py:534] (3/4) Epoch 525, validation: discriminator_loss=2.588, discriminator_real_loss=1.301, discriminator_fake_loss=1.287, generator_loss=30.91, generator_mel_loss=21.35, generator_kl_loss=2.008, generator_dur_loss=1.478, generator_adv_loss=2.089, generator_feat_match_loss=3.989, over 100.00 samples.
2024-02-23 16:23:43,240 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 16:25:55,014 INFO [train.py:845] (3/4) Start epoch 526
2024-02-23 16:28:24,315 INFO [train.py:471] (3/4) Epoch 526, batch 25, global_batch_idx: 19450, batch size: 60, loss[discriminator_loss=3.092, discriminator_real_loss=1.762, discriminator_fake_loss=1.33, generator_loss=29.53, generator_mel_loss=20.52, generator_kl_loss=1.924, generator_dur_loss=1.499, generator_adv_loss=2.27, generator_feat_match_loss=3.318, over 60.00 samples.], tot_loss[discriminator_loss=2.577, discriminator_real_loss=1.296, discriminator_fake_loss=1.281, generator_loss=30.37, generator_mel_loss=20.29, generator_kl_loss=1.909, generator_dur_loss=1.487, generator_adv_loss=2.41, generator_feat_match_loss=4.267, over 1750.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:29:24,286 INFO [train.py:845] (3/4) Start epoch 527
2024-02-23 16:32:52,126 INFO [train.py:845] (3/4) Start epoch 528
2024-02-23 16:33:08,127 INFO [train.py:471] (3/4) Epoch 528, batch 1, global_batch_idx: 19500, batch size: 49, loss[discriminator_loss=2.627, discriminator_real_loss=1.315, discriminator_fake_loss=1.312, generator_loss=29.48, generator_mel_loss=20.2, generator_kl_loss=1.954, generator_dur_loss=1.494, generator_adv_loss=2.094, generator_feat_match_loss=3.734, over 49.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.305, discriminator_fake_loss=1.311, generator_loss=29.45, generator_mel_loss=20.27, generator_kl_loss=1.942, generator_dur_loss=1.488, generator_adv_loss=2.008, generator_feat_match_loss=3.75, over 122.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:36:20,491 INFO [train.py:845] (3/4) Start epoch 529
2024-02-23 16:37:55,658 INFO [train.py:471] (3/4) Epoch 529, batch 14, global_batch_idx: 19550, batch size: 73, loss[discriminator_loss=2.463, discriminator_real_loss=1.188, discriminator_fake_loss=1.275, generator_loss=30.14, generator_mel_loss=20.3, generator_kl_loss=1.882, generator_dur_loss=1.484, generator_adv_loss=2.398, generator_feat_match_loss=4.07, over 73.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.279, discriminator_fake_loss=1.294, generator_loss=30.23, generator_mel_loss=20.31, generator_kl_loss=1.883, generator_dur_loss=1.487, generator_adv_loss=2.327, generator_feat_match_loss=4.23, over 1024.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0
2024-02-23 16:39:55,185 INFO [train.py:845] (3/4) Start epoch 530
2024-02-23 16:42:34,522 INFO [train.py:471] (3/4) Epoch 530, batch 27, global_batch_idx: 19600, batch size: 101, loss[discriminator_loss=2.643, discriminator_real_loss=1.426, discriminator_fake_loss=1.217, generator_loss=29.38, generator_mel_loss=20.27, generator_kl_loss=1.845, generator_dur_loss=1.473, generator_adv_loss=2.217, generator_feat_match_loss=3.57, over 101.00 samples.], tot_loss[discriminator_loss=2.661, discriminator_real_loss=1.35, discriminator_fake_loss=1.311, generator_loss=29.67, generator_mel_loss=20.18, generator_kl_loss=1.906, generator_dur_loss=1.483, generator_adv_loss=2.193, generator_feat_match_loss=3.909, over 2151.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0
2024-02-23 16:42:34,523 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 16:42:42,771 INFO [train.py:534] (3/4) Epoch 530, validation: discriminator_loss=2.578, discriminator_real_loss=1.386, discriminator_fake_loss=1.192, generator_loss=30.71, generator_mel_loss=21.15, generator_kl_loss=1.929, generator_dur_loss=1.484, generator_adv_loss=2.28, generator_feat_match_loss=3.873, over 100.00 samples.
2024-02-23 16:42:42,772 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 16:43:33,258 INFO [train.py:845] (3/4) Start epoch 531 2024-02-23 16:47:00,561 INFO [train.py:845] (3/4) Start epoch 532 2024-02-23 16:47:29,100 INFO [train.py:471] (3/4) Epoch 532, batch 3, global_batch_idx: 19650, batch size: 101, loss[discriminator_loss=2.613, discriminator_real_loss=1.144, discriminator_fake_loss=1.469, generator_loss=30.69, generator_mel_loss=20.74, generator_kl_loss=1.84, generator_dur_loss=1.473, generator_adv_loss=2.533, generator_feat_match_loss=4.109, over 101.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.237, discriminator_fake_loss=1.331, generator_loss=30.89, generator_mel_loss=20.75, generator_kl_loss=1.859, generator_dur_loss=1.473, generator_adv_loss=2.398, generator_feat_match_loss=4.412, over 323.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 16:50:26,654 INFO [train.py:845] (3/4) Start epoch 533 2024-02-23 16:52:10,182 INFO [train.py:471] (3/4) Epoch 533, batch 16, global_batch_idx: 19700, batch size: 54, loss[discriminator_loss=2.609, discriminator_real_loss=1.432, discriminator_fake_loss=1.178, generator_loss=29.77, generator_mel_loss=20.2, generator_kl_loss=1.914, generator_dur_loss=1.512, generator_adv_loss=2.203, generator_feat_match_loss=3.939, over 54.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.32, discriminator_fake_loss=1.291, generator_loss=29.9, generator_mel_loss=20.25, generator_kl_loss=1.904, generator_dur_loss=1.477, generator_adv_loss=2.25, generator_feat_match_loss=4.017, over 1263.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 16:53:58,088 INFO [train.py:845] (3/4) Start epoch 534 2024-02-23 16:56:46,340 INFO [train.py:471] (3/4) Epoch 534, batch 29, global_batch_idx: 19750, batch size: 50, loss[discriminator_loss=2.938, discriminator_real_loss=1.31, discriminator_fake_loss=1.629, 
generator_loss=27.97, generator_mel_loss=19.44, generator_kl_loss=1.84, generator_dur_loss=1.491, generator_adv_loss=2.023, generator_feat_match_loss=3.182, over 50.00 samples.], tot_loss[discriminator_loss=2.587, discriminator_real_loss=1.297, discriminator_fake_loss=1.289, generator_loss=30.08, generator_mel_loss=20.28, generator_kl_loss=1.917, generator_dur_loss=1.486, generator_adv_loss=2.269, generator_feat_match_loss=4.12, over 1968.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 16:57:25,022 INFO [train.py:845] (3/4) Start epoch 535 2024-02-23 17:00:57,875 INFO [train.py:845] (3/4) Start epoch 536 2024-02-23 17:01:35,586 INFO [train.py:471] (3/4) Epoch 536, batch 5, global_batch_idx: 19800, batch size: 50, loss[discriminator_loss=2.586, discriminator_real_loss=1.382, discriminator_fake_loss=1.203, generator_loss=30.31, generator_mel_loss=20.22, generator_kl_loss=1.922, generator_dur_loss=1.516, generator_adv_loss=2.406, generator_feat_match_loss=4.242, over 50.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.292, discriminator_fake_loss=1.279, generator_loss=30.32, generator_mel_loss=20.37, generator_kl_loss=1.885, generator_dur_loss=1.491, generator_adv_loss=2.356, generator_feat_match_loss=4.215, over 341.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:01:35,588 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 17:01:44,741 INFO [train.py:534] (3/4) Epoch 536, validation: discriminator_loss=2.498, discriminator_real_loss=1.277, discriminator_fake_loss=1.22, generator_loss=31.39, generator_mel_loss=21.14, generator_kl_loss=2.032, generator_dur_loss=1.48, generator_adv_loss=2.483, generator_feat_match_loss=4.258, over 100.00 samples. 
2024-02-23 17:01:44,742 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 17:04:39,964 INFO [train.py:845] (3/4) Start epoch 537 2024-02-23 17:06:25,834 INFO [train.py:471] (3/4) Epoch 537, batch 18, global_batch_idx: 19850, batch size: 85, loss[discriminator_loss=2.568, discriminator_real_loss=1.298, discriminator_fake_loss=1.271, generator_loss=28.93, generator_mel_loss=20.03, generator_kl_loss=1.887, generator_dur_loss=1.489, generator_adv_loss=2.025, generator_feat_match_loss=3.5, over 85.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.334, discriminator_fake_loss=1.299, generator_loss=29.96, generator_mel_loss=20.38, generator_kl_loss=1.904, generator_dur_loss=1.481, generator_adv_loss=2.199, generator_feat_match_loss=3.991, over 1518.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:08:03,326 INFO [train.py:845] (3/4) Start epoch 538 2024-02-23 17:11:02,434 INFO [train.py:471] (3/4) Epoch 538, batch 31, global_batch_idx: 19900, batch size: 110, loss[discriminator_loss=2.98, discriminator_real_loss=1.561, discriminator_fake_loss=1.419, generator_loss=29.41, generator_mel_loss=20.67, generator_kl_loss=1.85, generator_dur_loss=1.466, generator_adv_loss=1.906, generator_feat_match_loss=3.525, over 110.00 samples.], tot_loss[discriminator_loss=2.608, discriminator_real_loss=1.318, discriminator_fake_loss=1.29, generator_loss=30.36, generator_mel_loss=20.43, generator_kl_loss=1.929, generator_dur_loss=1.48, generator_adv_loss=2.29, generator_feat_match_loss=4.232, over 2588.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:11:34,932 INFO [train.py:845] (3/4) Start epoch 539 2024-02-23 17:15:04,986 INFO [train.py:845] (3/4) Start epoch 540 2024-02-23 17:16:02,504 INFO [train.py:471] (3/4) Epoch 540, batch 7, global_batch_idx: 19950, batch size: 56, loss[discriminator_loss=2.621, discriminator_real_loss=1.276, discriminator_fake_loss=1.345, 
generator_loss=29.78, generator_mel_loss=20.18, generator_kl_loss=1.943, generator_dur_loss=1.479, generator_adv_loss=2.203, generator_feat_match_loss=3.982, over 56.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.283, discriminator_fake_loss=1.295, generator_loss=29.86, generator_mel_loss=20.26, generator_kl_loss=1.916, generator_dur_loss=1.482, generator_adv_loss=2.193, generator_feat_match_loss=4.014, over 594.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:18:31,566 INFO [train.py:845] (3/4) Start epoch 541 2024-02-23 17:20:31,847 INFO [train.py:471] (3/4) Epoch 541, batch 20, global_batch_idx: 20000, batch size: 73, loss[discriminator_loss=2.488, discriminator_real_loss=1.266, discriminator_fake_loss=1.222, generator_loss=30.45, generator_mel_loss=20.06, generator_kl_loss=1.944, generator_dur_loss=1.481, generator_adv_loss=2.35, generator_feat_match_loss=4.613, over 73.00 samples.], tot_loss[discriminator_loss=2.61, discriminator_real_loss=1.351, discriminator_fake_loss=1.259, generator_loss=29.99, generator_mel_loss=20.1, generator_kl_loss=1.895, generator_dur_loss=1.486, generator_adv_loss=2.307, generator_feat_match_loss=4.199, over 1578.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2024-02-23 17:20:31,849 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 17:20:40,075 INFO [train.py:534] (3/4) Epoch 541, validation: discriminator_loss=2.334, discriminator_real_loss=1.144, discriminator_fake_loss=1.19, generator_loss=31.42, generator_mel_loss=21, generator_kl_loss=2.015, generator_dur_loss=1.483, generator_adv_loss=2.197, generator_feat_match_loss=4.727, over 100.00 samples. 
2024-02-23 17:20:40,076 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 17:22:10,813 INFO [train.py:845] (3/4) Start epoch 542 2024-02-23 17:25:22,890 INFO [train.py:471] (3/4) Epoch 542, batch 33, global_batch_idx: 20050, batch size: 67, loss[discriminator_loss=2.59, discriminator_real_loss=1.419, discriminator_fake_loss=1.172, generator_loss=30.48, generator_mel_loss=20.65, generator_kl_loss=1.957, generator_dur_loss=1.489, generator_adv_loss=2.113, generator_feat_match_loss=4.273, over 67.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.324, discriminator_fake_loss=1.268, generator_loss=29.91, generator_mel_loss=20.34, generator_kl_loss=1.917, generator_dur_loss=1.478, generator_adv_loss=2.181, generator_feat_match_loss=3.996, over 2431.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:25:39,075 INFO [train.py:845] (3/4) Start epoch 543 2024-02-23 17:29:10,450 INFO [train.py:845] (3/4) Start epoch 544 2024-02-23 17:30:11,861 INFO [train.py:471] (3/4) Epoch 544, batch 9, global_batch_idx: 20100, batch size: 73, loss[discriminator_loss=2.723, discriminator_real_loss=1.512, discriminator_fake_loss=1.211, generator_loss=29.61, generator_mel_loss=20.02, generator_kl_loss=1.912, generator_dur_loss=1.466, generator_adv_loss=2.375, generator_feat_match_loss=3.832, over 73.00 samples.], tot_loss[discriminator_loss=2.665, discriminator_real_loss=1.377, discriminator_fake_loss=1.288, generator_loss=29.88, generator_mel_loss=20.23, generator_kl_loss=1.892, generator_dur_loss=1.481, generator_adv_loss=2.242, generator_feat_match_loss=4.027, over 673.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:32:43,815 INFO [train.py:845] (3/4) Start epoch 545 2024-02-23 17:34:57,799 INFO [train.py:471] (3/4) Epoch 545, batch 22, global_batch_idx: 20150, batch size: 55, loss[discriminator_loss=2.719, discriminator_real_loss=1.475, discriminator_fake_loss=1.245, 
generator_loss=29.33, generator_mel_loss=20.24, generator_kl_loss=1.888, generator_dur_loss=1.512, generator_adv_loss=1.977, generator_feat_match_loss=3.711, over 55.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.348, discriminator_fake_loss=1.278, generator_loss=29.7, generator_mel_loss=20.11, generator_kl_loss=1.902, generator_dur_loss=1.478, generator_adv_loss=2.166, generator_feat_match_loss=4.046, over 1687.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:36:17,426 INFO [train.py:845] (3/4) Start epoch 546 2024-02-23 17:39:41,165 INFO [train.py:471] (3/4) Epoch 546, batch 35, global_batch_idx: 20200, batch size: 64, loss[discriminator_loss=2.59, discriminator_real_loss=1.386, discriminator_fake_loss=1.205, generator_loss=30.46, generator_mel_loss=20.42, generator_kl_loss=1.81, generator_dur_loss=1.482, generator_adv_loss=2.311, generator_feat_match_loss=4.438, over 64.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.326, discriminator_fake_loss=1.297, generator_loss=29.89, generator_mel_loss=20.37, generator_kl_loss=1.904, generator_dur_loss=1.482, generator_adv_loss=2.147, generator_feat_match_loss=3.978, over 2788.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:39:41,166 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 17:39:49,528 INFO [train.py:534] (3/4) Epoch 546, validation: discriminator_loss=2.47, discriminator_real_loss=1.181, discriminator_fake_loss=1.289, generator_loss=30.19, generator_mel_loss=20.64, generator_kl_loss=2.002, generator_dur_loss=1.483, generator_adv_loss=2.068, generator_feat_match_loss=3.988, over 100.00 samples. 
2024-02-23 17:39:49,529 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 17:39:53,881 INFO [train.py:845] (3/4) Start epoch 547 2024-02-23 17:43:27,335 INFO [train.py:845] (3/4) Start epoch 548 2024-02-23 17:44:38,651 INFO [train.py:471] (3/4) Epoch 548, batch 11, global_batch_idx: 20250, batch size: 61, loss[discriminator_loss=2.557, discriminator_real_loss=1.213, discriminator_fake_loss=1.344, generator_loss=30.02, generator_mel_loss=20.3, generator_kl_loss=1.854, generator_dur_loss=1.489, generator_adv_loss=2.141, generator_feat_match_loss=4.242, over 61.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.273, discriminator_fake_loss=1.29, generator_loss=29.95, generator_mel_loss=20.26, generator_kl_loss=1.914, generator_dur_loss=1.483, generator_adv_loss=2.211, generator_feat_match_loss=4.077, over 880.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:46:59,680 INFO [train.py:845] (3/4) Start epoch 549 2024-02-23 17:49:18,243 INFO [train.py:471] (3/4) Epoch 549, batch 24, global_batch_idx: 20300, batch size: 64, loss[discriminator_loss=3.004, discriminator_real_loss=1.642, discriminator_fake_loss=1.361, generator_loss=28.82, generator_mel_loss=20.06, generator_kl_loss=1.905, generator_dur_loss=1.478, generator_adv_loss=2.094, generator_feat_match_loss=3.291, over 64.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.316, discriminator_fake_loss=1.283, generator_loss=29.99, generator_mel_loss=20.04, generator_kl_loss=1.911, generator_dur_loss=1.487, generator_adv_loss=2.296, generator_feat_match_loss=4.256, over 1600.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:50:23,995 INFO [train.py:845] (3/4) Start epoch 550 2024-02-23 17:53:52,715 INFO [train.py:845] (3/4) Start epoch 551 2024-02-23 17:54:06,390 INFO [train.py:471] (3/4) Epoch 551, batch 0, global_batch_idx: 20350, batch size: 85, 
loss[discriminator_loss=2.617, discriminator_real_loss=1.354, discriminator_fake_loss=1.264, generator_loss=29.79, generator_mel_loss=20.27, generator_kl_loss=1.955, generator_dur_loss=1.471, generator_adv_loss=2.119, generator_feat_match_loss=3.973, over 85.00 samples.], tot_loss[discriminator_loss=2.617, discriminator_real_loss=1.354, discriminator_fake_loss=1.264, generator_loss=29.79, generator_mel_loss=20.27, generator_kl_loss=1.955, generator_dur_loss=1.471, generator_adv_loss=2.119, generator_feat_match_loss=3.973, over 85.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 17:57:26,736 INFO [train.py:845] (3/4) Start epoch 552 2024-02-23 17:58:50,421 INFO [train.py:471] (3/4) Epoch 552, batch 13, global_batch_idx: 20400, batch size: 79, loss[discriminator_loss=2.375, discriminator_real_loss=1.224, discriminator_fake_loss=1.152, generator_loss=29.77, generator_mel_loss=19.72, generator_kl_loss=1.87, generator_dur_loss=1.473, generator_adv_loss=2.174, generator_feat_match_loss=4.527, over 79.00 samples.], tot_loss[discriminator_loss=2.659, discriminator_real_loss=1.392, discriminator_fake_loss=1.267, generator_loss=29.89, generator_mel_loss=20.24, generator_kl_loss=1.887, generator_dur_loss=1.487, generator_adv_loss=2.196, generator_feat_match_loss=4.076, over 943.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2024-02-23 17:58:50,423 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 17:58:59,561 INFO [train.py:534] (3/4) Epoch 552, validation: discriminator_loss=2.474, discriminator_real_loss=1.045, discriminator_fake_loss=1.429, generator_loss=30.46, generator_mel_loss=20.85, generator_kl_loss=2.012, generator_dur_loss=1.487, generator_adv_loss=1.767, generator_feat_match_loss=4.343, over 100.00 samples. 
2024-02-23 17:58:59,562 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 18:01:08,093 INFO [train.py:845] (3/4) Start epoch 553 2024-02-23 18:03:45,242 INFO [train.py:471] (3/4) Epoch 553, batch 26, global_batch_idx: 20450, batch size: 63, loss[discriminator_loss=2.441, discriminator_real_loss=1.191, discriminator_fake_loss=1.251, generator_loss=29.75, generator_mel_loss=19.94, generator_kl_loss=1.934, generator_dur_loss=1.479, generator_adv_loss=2.232, generator_feat_match_loss=4.164, over 63.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.291, discriminator_fake_loss=1.292, generator_loss=29.67, generator_mel_loss=20.26, generator_kl_loss=1.933, generator_dur_loss=1.481, generator_adv_loss=2.089, generator_feat_match_loss=3.907, over 1922.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2024-02-23 18:04:40,834 INFO [train.py:845] (3/4) Start epoch 554 2024-02-23 18:08:06,583 INFO [train.py:845] (3/4) Start epoch 555 2024-02-23 18:08:31,450 INFO [train.py:471] (3/4) Epoch 555, batch 2, global_batch_idx: 20500, batch size: 153, loss[discriminator_loss=2.582, discriminator_real_loss=1.319, discriminator_fake_loss=1.263, generator_loss=30.07, generator_mel_loss=20.24, generator_kl_loss=1.922, generator_dur_loss=1.442, generator_adv_loss=2.256, generator_feat_match_loss=4.215, over 153.00 samples.], tot_loss[discriminator_loss=2.583, discriminator_real_loss=1.324, discriminator_fake_loss=1.259, generator_loss=29.67, generator_mel_loss=20.08, generator_kl_loss=1.918, generator_dur_loss=1.455, generator_adv_loss=2.227, generator_feat_match_loss=3.99, over 273.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 16.0 2024-02-23 18:11:32,085 INFO [train.py:845] (3/4) Start epoch 556 2024-02-23 18:13:05,860 INFO [train.py:471] (3/4) Epoch 556, batch 15, global_batch_idx: 20550, batch size: 95, loss[discriminator_loss=2.664, discriminator_real_loss=1.398, 
discriminator_fake_loss=1.265, generator_loss=29.6, generator_mel_loss=19.94, generator_kl_loss=1.919, generator_dur_loss=1.458, generator_adv_loss=2.184, generator_feat_match_loss=4.102, over 95.00 samples.], tot_loss[discriminator_loss=2.575, discriminator_real_loss=1.31, discriminator_fake_loss=1.264, generator_loss=29.99, generator_mel_loss=20.01, generator_kl_loss=1.901, generator_dur_loss=1.477, generator_adv_loss=2.285, generator_feat_match_loss=4.309, over 1284.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 18:15:02,194 INFO [train.py:845] (3/4) Start epoch 557 2024-02-23 18:17:43,640 INFO [train.py:471] (3/4) Epoch 557, batch 28, global_batch_idx: 20600, batch size: 101, loss[discriminator_loss=2.475, discriminator_real_loss=1.34, discriminator_fake_loss=1.135, generator_loss=30.93, generator_mel_loss=20.52, generator_kl_loss=1.943, generator_dur_loss=1.478, generator_adv_loss=2.303, generator_feat_match_loss=4.688, over 101.00 samples.], tot_loss[discriminator_loss=2.555, discriminator_real_loss=1.303, discriminator_fake_loss=1.252, generator_loss=30.23, generator_mel_loss=20.32, generator_kl_loss=1.914, generator_dur_loss=1.482, generator_adv_loss=2.278, generator_feat_match_loss=4.241, over 2109.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 18:17:43,642 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 18:17:52,537 INFO [train.py:534] (3/4) Epoch 557, validation: discriminator_loss=2.607, discriminator_real_loss=1.101, discriminator_fake_loss=1.506, generator_loss=30.57, generator_mel_loss=21.3, generator_kl_loss=2.043, generator_dur_loss=1.48, generator_adv_loss=1.69, generator_feat_match_loss=4.053, over 100.00 samples. 
2024-02-23 18:17:52,538 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 18:18:35,523 INFO [train.py:845] (3/4) Start epoch 558 2024-02-23 18:22:02,167 INFO [train.py:845] (3/4) Start epoch 559 2024-02-23 18:22:40,413 INFO [train.py:471] (3/4) Epoch 559, batch 4, global_batch_idx: 20650, batch size: 153, loss[discriminator_loss=2.543, discriminator_real_loss=1.23, discriminator_fake_loss=1.313, generator_loss=30.3, generator_mel_loss=20.67, generator_kl_loss=1.832, generator_dur_loss=1.457, generator_adv_loss=2.23, generator_feat_match_loss=4.109, over 153.00 samples.], tot_loss[discriminator_loss=2.588, discriminator_real_loss=1.256, discriminator_fake_loss=1.332, generator_loss=30.11, generator_mel_loss=20.43, generator_kl_loss=1.873, generator_dur_loss=1.473, generator_adv_loss=2.223, generator_feat_match_loss=4.107, over 472.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 18:25:28,579 INFO [train.py:845] (3/4) Start epoch 560 2024-02-23 18:27:12,630 INFO [train.py:471] (3/4) Epoch 560, batch 17, global_batch_idx: 20700, batch size: 54, loss[discriminator_loss=2.609, discriminator_real_loss=1.267, discriminator_fake_loss=1.344, generator_loss=30.16, generator_mel_loss=20.45, generator_kl_loss=1.896, generator_dur_loss=1.492, generator_adv_loss=2.232, generator_feat_match_loss=4.086, over 54.00 samples.], tot_loss[discriminator_loss=2.519, discriminator_real_loss=1.267, discriminator_fake_loss=1.252, generator_loss=30.18, generator_mel_loss=20.04, generator_kl_loss=1.904, generator_dur_loss=1.485, generator_adv_loss=2.356, generator_feat_match_loss=4.394, over 1233.00 samples.], cur_lr_g: 1.87e-04, cur_lr_d: 1.87e-04, grad_scale: 8.0 2024-02-23 18:28:59,883 INFO [train.py:845] (3/4) Start epoch 561 2024-02-23 18:31:52,341 INFO [train.py:471] (3/4) Epoch 561, batch 30, global_batch_idx: 20750, batch size: 85, loss[discriminator_loss=2.539, discriminator_real_loss=1.218, discriminator_fake_loss=1.32, 
generator_loss=29.89, generator_mel_loss=19.86, generator_kl_loss=1.881, generator_dur_loss=1.485, generator_adv_loss=2.371, generator_feat_match_loss=4.293, over 85.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.282, discriminator_fake_loss=1.281, generator_loss=30.28, generator_mel_loss=20.36, generator_kl_loss=1.898, generator_dur_loss=1.486, generator_adv_loss=2.29, generator_feat_match_loss=4.25, over 2225.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 18:32:29,595 INFO [train.py:845] (3/4) Start epoch 562 2024-02-23 18:35:56,707 INFO [train.py:845] (3/4) Start epoch 563 2024-02-23 18:36:41,426 INFO [train.py:471] (3/4) Epoch 563, batch 6, global_batch_idx: 20800, batch size: 59, loss[discriminator_loss=2.574, discriminator_real_loss=1.268, discriminator_fake_loss=1.307, generator_loss=29.65, generator_mel_loss=20.15, generator_kl_loss=1.939, generator_dur_loss=1.491, generator_adv_loss=2.113, generator_feat_match_loss=3.959, over 59.00 samples.], tot_loss[discriminator_loss=2.618, discriminator_real_loss=1.349, discriminator_fake_loss=1.269, generator_loss=29.98, generator_mel_loss=20.25, generator_kl_loss=1.948, generator_dur_loss=1.485, generator_adv_loss=2.225, generator_feat_match_loss=4.071, over 475.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2024-02-23 18:36:41,428 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 18:36:50,247 INFO [train.py:534] (3/4) Epoch 563, validation: discriminator_loss=2.591, discriminator_real_loss=1.133, discriminator_fake_loss=1.458, generator_loss=30.62, generator_mel_loss=21.25, generator_kl_loss=1.977, generator_dur_loss=1.474, generator_adv_loss=1.843, generator_feat_match_loss=4.08, over 100.00 samples. 
2024-02-23 18:36:50,249 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 18:39:39,147 INFO [train.py:845] (3/4) Start epoch 564 2024-02-23 18:41:38,073 INFO [train.py:471] (3/4) Epoch 564, batch 19, global_batch_idx: 20850, batch size: 110, loss[discriminator_loss=2.645, discriminator_real_loss=1.382, discriminator_fake_loss=1.262, generator_loss=30.81, generator_mel_loss=20.18, generator_kl_loss=1.92, generator_dur_loss=1.473, generator_adv_loss=2.602, generator_feat_match_loss=4.637, over 110.00 samples.], tot_loss[discriminator_loss=2.619, discriminator_real_loss=1.312, discriminator_fake_loss=1.308, generator_loss=30.05, generator_mel_loss=20.15, generator_kl_loss=1.896, generator_dur_loss=1.476, generator_adv_loss=2.305, generator_feat_match_loss=4.219, over 1494.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 18:43:09,233 INFO [train.py:845] (3/4) Start epoch 565 2024-02-23 18:46:14,422 INFO [train.py:471] (3/4) Epoch 565, batch 32, global_batch_idx: 20900, batch size: 73, loss[discriminator_loss=2.41, discriminator_real_loss=1.227, discriminator_fake_loss=1.184, generator_loss=30.54, generator_mel_loss=20.17, generator_kl_loss=1.988, generator_dur_loss=1.449, generator_adv_loss=2.254, generator_feat_match_loss=4.676, over 73.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.322, discriminator_fake_loss=1.267, generator_loss=30.19, generator_mel_loss=20.43, generator_kl_loss=1.925, generator_dur_loss=1.482, generator_adv_loss=2.233, generator_feat_match_loss=4.117, over 2381.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 18:46:37,709 INFO [train.py:845] (3/4) Start epoch 566 2024-02-23 18:50:03,096 INFO [train.py:845] (3/4) Start epoch 567 2024-02-23 18:50:57,740 INFO [train.py:471] (3/4) Epoch 567, batch 8, global_batch_idx: 20950, batch size: 69, loss[discriminator_loss=2.504, discriminator_real_loss=1.251, 
discriminator_fake_loss=1.252, generator_loss=30.73, generator_mel_loss=20.34, generator_kl_loss=1.988, generator_dur_loss=1.491, generator_adv_loss=2.283, generator_feat_match_loss=4.633, over 69.00 samples.], tot_loss[discriminator_loss=2.548, discriminator_real_loss=1.315, discriminator_fake_loss=1.233, generator_loss=30.11, generator_mel_loss=20.26, generator_kl_loss=1.944, generator_dur_loss=1.473, generator_adv_loss=2.261, generator_feat_match_loss=4.169, over 615.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 18:53:30,702 INFO [train.py:845] (3/4) Start epoch 568 2024-02-23 18:55:34,844 INFO [train.py:471] (3/4) Epoch 568, batch 21, global_batch_idx: 21000, batch size: 73, loss[discriminator_loss=2.531, discriminator_real_loss=1.197, discriminator_fake_loss=1.335, generator_loss=30.45, generator_mel_loss=20.01, generator_kl_loss=1.941, generator_dur_loss=1.48, generator_adv_loss=2.236, generator_feat_match_loss=4.785, over 73.00 samples.], tot_loss[discriminator_loss=2.539, discriminator_real_loss=1.296, discriminator_fake_loss=1.243, generator_loss=30.36, generator_mel_loss=20.16, generator_kl_loss=1.902, generator_dur_loss=1.478, generator_adv_loss=2.356, generator_feat_match_loss=4.466, over 1668.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 18:55:34,846 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 18:55:43,202 INFO [train.py:534] (3/4) Epoch 568, validation: discriminator_loss=2.541, discriminator_real_loss=1.119, discriminator_fake_loss=1.422, generator_loss=29.93, generator_mel_loss=20.55, generator_kl_loss=1.977, generator_dur_loss=1.476, generator_adv_loss=1.76, generator_feat_match_loss=4.171, over 100.00 samples. 
2024-02-23 18:55:43,203 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-23 18:57:07,270 INFO [train.py:845] (3/4) Start epoch 569 2024-02-23 19:00:29,555 INFO [train.py:471] (3/4) Epoch 569, batch 34, global_batch_idx: 21050, batch size: 76, loss[discriminator_loss=2.609, discriminator_real_loss=1.305, discriminator_fake_loss=1.306, generator_loss=29.66, generator_mel_loss=20.29, generator_kl_loss=1.89, generator_dur_loss=1.504, generator_adv_loss=2.031, generator_feat_match_loss=3.938, over 76.00 samples.], tot_loss[discriminator_loss=2.573, discriminator_real_loss=1.3, discriminator_fake_loss=1.273, generator_loss=30.04, generator_mel_loss=20.13, generator_kl_loss=1.922, generator_dur_loss=1.478, generator_adv_loss=2.246, generator_feat_match_loss=4.266, over 2671.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 19:00:39,533 INFO [train.py:845] (3/4) Start epoch 570 2024-02-23 19:04:09,231 INFO [train.py:845] (3/4) Start epoch 571 2024-02-23 19:05:24,869 INFO [train.py:471] (3/4) Epoch 571, batch 10, global_batch_idx: 21100, batch size: 65, loss[discriminator_loss=2.666, discriminator_real_loss=1.387, discriminator_fake_loss=1.279, generator_loss=29.03, generator_mel_loss=19.84, generator_kl_loss=1.948, generator_dur_loss=1.472, generator_adv_loss=2.121, generator_feat_match_loss=3.65, over 65.00 samples.], tot_loss[discriminator_loss=2.651, discriminator_real_loss=1.328, discriminator_fake_loss=1.323, generator_loss=29.89, generator_mel_loss=20.08, generator_kl_loss=1.907, generator_dur_loss=1.475, generator_adv_loss=2.272, generator_feat_match_loss=4.157, over 850.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 19:07:35,059 INFO [train.py:845] (3/4) Start epoch 572 2024-02-23 19:09:58,278 INFO [train.py:471] (3/4) Epoch 572, batch 23, global_batch_idx: 21150, batch size: 153, loss[discriminator_loss=2.553, discriminator_real_loss=1.188, discriminator_fake_loss=1.365, 
generator_loss=30.14, generator_mel_loss=20.19, generator_kl_loss=1.815, generator_dur_loss=1.474, generator_adv_loss=2.016, generator_feat_match_loss=4.641, over 153.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.334, discriminator_fake_loss=1.288, generator_loss=30.18, generator_mel_loss=20.38, generator_kl_loss=1.893, generator_dur_loss=1.474, generator_adv_loss=2.241, generator_feat_match_loss=4.189, over 2086.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0 2024-02-23 19:11:07,505 INFO [train.py:845] (3/4) Start epoch 573 2024-02-23 19:14:27,066 INFO [train.py:471] (3/4) Epoch 573, batch 36, global_batch_idx: 21200, batch size: 85, loss[discriminator_loss=2.746, discriminator_real_loss=1.52, discriminator_fake_loss=1.228, generator_loss=30.55, generator_mel_loss=20.17, generator_kl_loss=1.962, generator_dur_loss=1.46, generator_adv_loss=2.604, generator_feat_match_loss=4.359, over 85.00 samples.], tot_loss[discriminator_loss=2.566, discriminator_real_loss=1.288, discriminator_fake_loss=1.278, generator_loss=30.11, generator_mel_loss=20.22, generator_kl_loss=1.896, generator_dur_loss=1.482, generator_adv_loss=2.271, generator_feat_match_loss=4.24, over 2519.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0 2024-02-23 19:14:27,068 INFO [train.py:525] (3/4) Computing validation loss 2024-02-23 19:14:36,149 INFO [train.py:534] (3/4) Epoch 573, validation: discriminator_loss=2.425, discriminator_real_loss=1.277, discriminator_fake_loss=1.148, generator_loss=31.77, generator_mel_loss=20.97, generator_kl_loss=2.009, generator_dur_loss=1.486, generator_adv_loss=2.621, generator_feat_match_loss=4.691, over 100.00 samples. 
2024-02-23 19:14:36,150 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 19:14:36,571 INFO [train.py:845] (3/4) Start epoch 574
2024-02-23 19:18:04,829 INFO [train.py:845] (3/4) Start epoch 575
2024-02-23 19:19:25,811 INFO [train.py:471] (3/4) Epoch 575, batch 12, global_batch_idx: 21250, batch size: 126, loss[discriminator_loss=2.547, discriminator_real_loss=1.235, discriminator_fake_loss=1.312, generator_loss=30.18, generator_mel_loss=20.31, generator_kl_loss=1.979, generator_dur_loss=1.453, generator_adv_loss=2.164, generator_feat_match_loss=4.27, over 126.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.319, discriminator_fake_loss=1.303, generator_loss=29.78, generator_mel_loss=20.28, generator_kl_loss=1.943, generator_dur_loss=1.471, generator_adv_loss=2.083, generator_feat_match_loss=4, over 1148.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:21:33,916 INFO [train.py:845] (3/4) Start epoch 576
2024-02-23 19:24:01,160 INFO [train.py:471] (3/4) Epoch 576, batch 25, global_batch_idx: 21300, batch size: 73, loss[discriminator_loss=2.445, discriminator_real_loss=1.271, discriminator_fake_loss=1.176, generator_loss=30.88, generator_mel_loss=20.36, generator_kl_loss=1.878, generator_dur_loss=1.463, generator_adv_loss=2.441, generator_feat_match_loss=4.738, over 73.00 samples.], tot_loss[discriminator_loss=2.549, discriminator_real_loss=1.281, discriminator_fake_loss=1.268, generator_loss=30.26, generator_mel_loss=20.2, generator_kl_loss=1.912, generator_dur_loss=1.478, generator_adv_loss=2.298, generator_feat_match_loss=4.375, over 2022.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:25:07,420 INFO [train.py:845] (3/4) Start epoch 577
2024-02-23 19:28:30,479 INFO [train.py:845] (3/4) Start epoch 578
2024-02-23 19:28:51,109 INFO [train.py:471] (3/4) Epoch 578, batch 1, global_batch_idx: 21350, batch size: 50, loss[discriminator_loss=2.473, discriminator_real_loss=1.331, discriminator_fake_loss=1.141, generator_loss=31.63, generator_mel_loss=20.83, generator_kl_loss=1.873, generator_dur_loss=1.511, generator_adv_loss=2.574, generator_feat_match_loss=4.848, over 50.00 samples.], tot_loss[discriminator_loss=2.682, discriminator_real_loss=1.545, discriminator_fake_loss=1.137, generator_loss=30.9, generator_mel_loss=20.51, generator_kl_loss=1.896, generator_dur_loss=1.504, generator_adv_loss=2.525, generator_feat_match_loss=4.464, over 160.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:32:00,958 INFO [train.py:845] (3/4) Start epoch 579
2024-02-23 19:33:34,739 INFO [train.py:471] (3/4) Epoch 579, batch 14, global_batch_idx: 21400, batch size: 79, loss[discriminator_loss=2.545, discriminator_real_loss=1.299, discriminator_fake_loss=1.246, generator_loss=30.98, generator_mel_loss=20.7, generator_kl_loss=1.924, generator_dur_loss=1.475, generator_adv_loss=2.48, generator_feat_match_loss=4.395, over 79.00 samples.], tot_loss[discriminator_loss=2.58, discriminator_real_loss=1.303, discriminator_fake_loss=1.277, generator_loss=30.4, generator_mel_loss=20.51, generator_kl_loss=1.926, generator_dur_loss=1.471, generator_adv_loss=2.207, generator_feat_match_loss=4.284, over 1279.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:33:34,740 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 19:33:42,647 INFO [train.py:534] (3/4) Epoch 579, validation: discriminator_loss=2.597, discriminator_real_loss=1.286, discriminator_fake_loss=1.311, generator_loss=30.01, generator_mel_loss=20.64, generator_kl_loss=2.079, generator_dur_loss=1.475, generator_adv_loss=1.972, generator_feat_match_loss=3.846, over 100.00 samples.
2024-02-23 19:33:42,648 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 19:35:40,858 INFO [train.py:845] (3/4) Start epoch 580
2024-02-23 19:38:13,315 INFO [train.py:471] (3/4) Epoch 580, batch 27, global_batch_idx: 21450, batch size: 90, loss[discriminator_loss=2.629, discriminator_real_loss=1.32, discriminator_fake_loss=1.31, generator_loss=29.62, generator_mel_loss=20.3, generator_kl_loss=1.979, generator_dur_loss=1.478, generator_adv_loss=2.023, generator_feat_match_loss=3.836, over 90.00 samples.], tot_loss[discriminator_loss=2.527, discriminator_real_loss=1.256, discriminator_fake_loss=1.271, generator_loss=30.18, generator_mel_loss=20.2, generator_kl_loss=1.914, generator_dur_loss=1.481, generator_adv_loss=2.237, generator_feat_match_loss=4.345, over 2089.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:39:05,911 INFO [train.py:845] (3/4) Start epoch 581
2024-02-23 19:42:32,588 INFO [train.py:845] (3/4) Start epoch 582
2024-02-23 19:43:06,761 INFO [train.py:471] (3/4) Epoch 582, batch 3, global_batch_idx: 21500, batch size: 126, loss[discriminator_loss=2.377, discriminator_real_loss=1.196, discriminator_fake_loss=1.181, generator_loss=29.98, generator_mel_loss=19.7, generator_kl_loss=1.98, generator_dur_loss=1.461, generator_adv_loss=2.354, generator_feat_match_loss=4.484, over 126.00 samples.], tot_loss[discriminator_loss=2.489, discriminator_real_loss=1.26, discriminator_fake_loss=1.229, generator_loss=29.78, generator_mel_loss=19.78, generator_kl_loss=1.949, generator_dur_loss=1.478, generator_adv_loss=2.24, generator_feat_match_loss=4.333, over 332.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:46:02,075 INFO [train.py:845] (3/4) Start epoch 583
2024-02-23 19:47:41,970 INFO [train.py:471] (3/4) Epoch 583, batch 16, global_batch_idx: 21550, batch size: 49, loss[discriminator_loss=2.736, discriminator_real_loss=1.322, discriminator_fake_loss=1.414, generator_loss=30.1, generator_mel_loss=20.47, generator_kl_loss=1.902, generator_dur_loss=1.494, generator_adv_loss=2.055, generator_feat_match_loss=4.18, over 49.00 samples.], tot_loss[discriminator_loss=2.565, discriminator_real_loss=1.29, discriminator_fake_loss=1.275, generator_loss=30.51, generator_mel_loss=20.43, generator_kl_loss=1.917, generator_dur_loss=1.477, generator_adv_loss=2.328, generator_feat_match_loss=4.36, over 1274.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:49:31,439 INFO [train.py:845] (3/4) Start epoch 584
2024-02-23 19:52:15,085 INFO [train.py:471] (3/4) Epoch 584, batch 29, global_batch_idx: 21600, batch size: 71, loss[discriminator_loss=2.629, discriminator_real_loss=1.474, discriminator_fake_loss=1.155, generator_loss=30.71, generator_mel_loss=20.64, generator_kl_loss=1.97, generator_dur_loss=1.477, generator_adv_loss=2.445, generator_feat_match_loss=4.18, over 71.00 samples.], tot_loss[discriminator_loss=2.621, discriminator_real_loss=1.338, discriminator_fake_loss=1.283, generator_loss=29.87, generator_mel_loss=20.03, generator_kl_loss=1.91, generator_dur_loss=1.484, generator_adv_loss=2.232, generator_feat_match_loss=4.209, over 2085.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2024-02-23 19:52:15,087 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 19:52:23,404 INFO [train.py:534] (3/4) Epoch 584, validation: discriminator_loss=2.624, discriminator_real_loss=1.448, discriminator_fake_loss=1.176, generator_loss=30.84, generator_mel_loss=20.86, generator_kl_loss=2.021, generator_dur_loss=1.481, generator_adv_loss=2.32, generator_feat_match_loss=4.16, over 100.00 samples.
2024-02-23 19:52:23,405 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 19:53:04,473 INFO [train.py:845] (3/4) Start epoch 585
2024-02-23 19:56:32,860 INFO [train.py:845] (3/4) Start epoch 586
2024-02-23 19:57:13,886 INFO [train.py:471] (3/4) Epoch 586, batch 5, global_batch_idx: 21650, batch size: 52, loss[discriminator_loss=3.32, discriminator_real_loss=1.89, discriminator_fake_loss=1.431, generator_loss=28.79, generator_mel_loss=20.16, generator_kl_loss=1.855, generator_dur_loss=1.497, generator_adv_loss=1.889, generator_feat_match_loss=3.395, over 52.00 samples.], tot_loss[discriminator_loss=2.779, discriminator_real_loss=1.43, discriminator_fake_loss=1.349, generator_loss=29.29, generator_mel_loss=19.83, generator_kl_loss=1.874, generator_dur_loss=1.48, generator_adv_loss=2.189, generator_feat_match_loss=3.918, over 448.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 19:59:57,297 INFO [train.py:845] (3/4) Start epoch 587
2024-02-23 20:01:42,109 INFO [train.py:471] (3/4) Epoch 587, batch 18, global_batch_idx: 21700, batch size: 65, loss[discriminator_loss=2.723, discriminator_real_loss=1.388, discriminator_fake_loss=1.335, generator_loss=29.22, generator_mel_loss=20.18, generator_kl_loss=1.823, generator_dur_loss=1.487, generator_adv_loss=1.992, generator_feat_match_loss=3.73, over 65.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.323, discriminator_fake_loss=1.276, generator_loss=29.77, generator_mel_loss=20.23, generator_kl_loss=1.911, generator_dur_loss=1.477, generator_adv_loss=2.153, generator_feat_match_loss=4.004, over 1437.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:03:24,489 INFO [train.py:845] (3/4) Start epoch 588
2024-02-23 20:06:21,768 INFO [train.py:471] (3/4) Epoch 588, batch 31, global_batch_idx: 21750, batch size: 53, loss[discriminator_loss=2.52, discriminator_real_loss=1.305, discriminator_fake_loss=1.216, generator_loss=30.27, generator_mel_loss=19.98, generator_kl_loss=2.018, generator_dur_loss=1.487, generator_adv_loss=2.371, generator_feat_match_loss=4.414, over 53.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.334, discriminator_fake_loss=1.294, generator_loss=29.92, generator_mel_loss=20.25, generator_kl_loss=1.911, generator_dur_loss=1.483, generator_adv_loss=2.211, generator_feat_match_loss=4.059, over 2072.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:06:53,971 INFO [train.py:845] (3/4) Start epoch 589
2024-02-23 20:10:24,327 INFO [train.py:845] (3/4) Start epoch 590
2024-02-23 20:11:14,451 INFO [train.py:471] (3/4) Epoch 590, batch 7, global_batch_idx: 21800, batch size: 67, loss[discriminator_loss=2.619, discriminator_real_loss=1.202, discriminator_fake_loss=1.417, generator_loss=29.63, generator_mel_loss=20.58, generator_kl_loss=1.905, generator_dur_loss=1.492, generator_adv_loss=1.927, generator_feat_match_loss=3.732, over 67.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.31, discriminator_fake_loss=1.295, generator_loss=29.94, generator_mel_loss=20.34, generator_kl_loss=1.935, generator_dur_loss=1.482, generator_adv_loss=2.13, generator_feat_match_loss=4.056, over 549.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:11:14,451 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 20:11:23,482 INFO [train.py:534] (3/4) Epoch 590, validation: discriminator_loss=2.511, discriminator_real_loss=1.132, discriminator_fake_loss=1.379, generator_loss=30.53, generator_mel_loss=21.17, generator_kl_loss=1.96, generator_dur_loss=1.477, generator_adv_loss=1.949, generator_feat_match_loss=3.97, over 100.00 samples.
2024-02-23 20:11:23,483 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 20:14:02,034 INFO [train.py:845] (3/4) Start epoch 591
2024-02-23 20:16:06,948 INFO [train.py:471] (3/4) Epoch 591, batch 20, global_batch_idx: 21850, batch size: 79, loss[discriminator_loss=2.547, discriminator_real_loss=1.228, discriminator_fake_loss=1.319, generator_loss=29.55, generator_mel_loss=19.99, generator_kl_loss=1.836, generator_dur_loss=1.506, generator_adv_loss=2.178, generator_feat_match_loss=4.039, over 79.00 samples.], tot_loss[discriminator_loss=2.574, discriminator_real_loss=1.308, discriminator_fake_loss=1.265, generator_loss=30.12, generator_mel_loss=20.17, generator_kl_loss=1.879, generator_dur_loss=1.478, generator_adv_loss=2.279, generator_feat_match_loss=4.318, over 1467.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:17:31,022 INFO [train.py:845] (3/4) Start epoch 592
2024-02-23 20:20:45,640 INFO [train.py:471] (3/4) Epoch 592, batch 33, global_batch_idx: 21900, batch size: 153, loss[discriminator_loss=2.33, discriminator_real_loss=1.133, discriminator_fake_loss=1.197, generator_loss=30.83, generator_mel_loss=20.29, generator_kl_loss=1.895, generator_dur_loss=1.458, generator_adv_loss=2.273, generator_feat_match_loss=4.906, over 153.00 samples.], tot_loss[discriminator_loss=2.541, discriminator_real_loss=1.293, discriminator_fake_loss=1.248, generator_loss=30.39, generator_mel_loss=20.16, generator_kl_loss=1.91, generator_dur_loss=1.479, generator_adv_loss=2.328, generator_feat_match_loss=4.514, over 2633.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:21:02,028 INFO [train.py:845] (3/4) Start epoch 593
2024-02-23 20:24:32,414 INFO [train.py:845] (3/4) Start epoch 594
2024-02-23 20:25:37,415 INFO [train.py:471] (3/4) Epoch 594, batch 9, global_batch_idx: 21950, batch size: 49, loss[discriminator_loss=2.637, discriminator_real_loss=1.299, discriminator_fake_loss=1.339, generator_loss=29.88, generator_mel_loss=19.79, generator_kl_loss=1.964, generator_dur_loss=1.5, generator_adv_loss=2.383, generator_feat_match_loss=4.242, over 49.00 samples.], tot_loss[discriminator_loss=2.66, discriminator_real_loss=1.329, discriminator_fake_loss=1.331, generator_loss=29.85, generator_mel_loss=20.43, generator_kl_loss=1.924, generator_dur_loss=1.487, generator_adv_loss=2.096, generator_feat_match_loss=3.913, over 720.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:27:58,895 INFO [train.py:845] (3/4) Start epoch 595
2024-02-23 20:30:11,447 INFO [train.py:471] (3/4) Epoch 595, batch 22, global_batch_idx: 22000, batch size: 52, loss[discriminator_loss=2.461, discriminator_real_loss=1.19, discriminator_fake_loss=1.27, generator_loss=30.86, generator_mel_loss=20.48, generator_kl_loss=1.96, generator_dur_loss=1.504, generator_adv_loss=2.109, generator_feat_match_loss=4.812, over 52.00 samples.], tot_loss[discriminator_loss=2.595, discriminator_real_loss=1.321, discriminator_fake_loss=1.274, generator_loss=30.23, generator_mel_loss=20.39, generator_kl_loss=1.922, generator_dur_loss=1.473, generator_adv_loss=2.218, generator_feat_match_loss=4.222, over 1723.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 16.0
2024-02-23 20:30:11,449 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 20:30:19,781 INFO [train.py:534] (3/4) Epoch 595, validation: discriminator_loss=2.672, discriminator_real_loss=1.005, discriminator_fake_loss=1.666, generator_loss=30.49, generator_mel_loss=21.32, generator_kl_loss=2.019, generator_dur_loss=1.482, generator_adv_loss=1.543, generator_feat_match_loss=4.124, over 100.00 samples.
2024-02-23 20:30:19,782 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 20:31:32,432 INFO [train.py:845] (3/4) Start epoch 596
2024-02-23 20:34:55,014 INFO [train.py:471] (3/4) Epoch 596, batch 35, global_batch_idx: 22050, batch size: 126, loss[discriminator_loss=2.514, discriminator_real_loss=1.15, discriminator_fake_loss=1.363, generator_loss=30.3, generator_mel_loss=20.27, generator_kl_loss=1.918, generator_dur_loss=1.479, generator_adv_loss=2.113, generator_feat_match_loss=4.523, over 126.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.313, discriminator_fake_loss=1.273, generator_loss=30.19, generator_mel_loss=20.11, generator_kl_loss=1.896, generator_dur_loss=1.474, generator_adv_loss=2.307, generator_feat_match_loss=4.406, over 2735.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:34:59,746 INFO [train.py:845] (3/4) Start epoch 597
2024-02-23 20:38:28,005 INFO [train.py:845] (3/4) Start epoch 598
2024-02-23 20:39:43,128 INFO [train.py:471] (3/4) Epoch 598, batch 11, global_batch_idx: 22100, batch size: 64, loss[discriminator_loss=2.336, discriminator_real_loss=1.14, discriminator_fake_loss=1.195, generator_loss=30.6, generator_mel_loss=19.7, generator_kl_loss=1.885, generator_dur_loss=1.482, generator_adv_loss=2.359, generator_feat_match_loss=5.18, over 64.00 samples.], tot_loss[discriminator_loss=2.69, discriminator_real_loss=1.448, discriminator_fake_loss=1.242, generator_loss=30.08, generator_mel_loss=20.09, generator_kl_loss=1.887, generator_dur_loss=1.472, generator_adv_loss=2.29, generator_feat_match_loss=4.338, over 907.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:41:59,211 INFO [train.py:845] (3/4) Start epoch 599
2024-02-23 20:44:16,177 INFO [train.py:471] (3/4) Epoch 599, batch 24, global_batch_idx: 22150, batch size: 55, loss[discriminator_loss=2.469, discriminator_real_loss=1.205, discriminator_fake_loss=1.264, generator_loss=30.21, generator_mel_loss=20.21, generator_kl_loss=1.937, generator_dur_loss=1.5, generator_adv_loss=2.252, generator_feat_match_loss=4.312, over 55.00 samples.], tot_loss[discriminator_loss=2.614, discriminator_real_loss=1.314, discriminator_fake_loss=1.3, generator_loss=29.66, generator_mel_loss=20.09, generator_kl_loss=1.921, generator_dur_loss=1.479, generator_adv_loss=2.128, generator_feat_match_loss=4.037, over 1670.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:45:29,625 INFO [train.py:845] (3/4) Start epoch 600
2024-02-23 20:49:02,007 INFO [train.py:845] (3/4) Start epoch 601
2024-02-23 20:49:14,980 INFO [train.py:471] (3/4) Epoch 601, batch 0, global_batch_idx: 22200, batch size: 65, loss[discriminator_loss=2.344, discriminator_real_loss=1.278, discriminator_fake_loss=1.066, generator_loss=30.4, generator_mel_loss=19.74, generator_kl_loss=1.986, generator_dur_loss=1.483, generator_adv_loss=2.473, generator_feat_match_loss=4.723, over 65.00 samples.], tot_loss[discriminator_loss=2.344, discriminator_real_loss=1.278, discriminator_fake_loss=1.066, generator_loss=30.4, generator_mel_loss=19.74, generator_kl_loss=1.986, generator_dur_loss=1.483, generator_adv_loss=2.473, generator_feat_match_loss=4.723, over 65.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:49:14,982 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 20:49:23,929 INFO [train.py:534] (3/4) Epoch 601, validation: discriminator_loss=2.249, discriminator_real_loss=1.067, discriminator_fake_loss=1.181, generator_loss=31.28, generator_mel_loss=20.49, generator_kl_loss=2.151, generator_dur_loss=1.478, generator_adv_loss=2.184, generator_feat_match_loss=4.975, over 100.00 samples.
2024-02-23 20:49:23,930 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 20:52:42,469 INFO [train.py:845] (3/4) Start epoch 602
2024-02-23 20:54:07,426 INFO [train.py:471] (3/4) Epoch 602, batch 13, global_batch_idx: 22250, batch size: 63, loss[discriminator_loss=2.629, discriminator_real_loss=1.199, discriminator_fake_loss=1.429, generator_loss=29.47, generator_mel_loss=19.91, generator_kl_loss=1.904, generator_dur_loss=1.47, generator_adv_loss=2.162, generator_feat_match_loss=4.023, over 63.00 samples.], tot_loss[discriminator_loss=2.613, discriminator_real_loss=1.311, discriminator_fake_loss=1.302, generator_loss=29.76, generator_mel_loss=20.23, generator_kl_loss=1.937, generator_dur_loss=1.479, generator_adv_loss=2.132, generator_feat_match_loss=3.986, over 955.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:56:12,073 INFO [train.py:845] (3/4) Start epoch 603
2024-02-23 20:58:45,498 INFO [train.py:471] (3/4) Epoch 603, batch 26, global_batch_idx: 22300, batch size: 71, loss[discriminator_loss=2.654, discriminator_real_loss=1.303, discriminator_fake_loss=1.352, generator_loss=29.72, generator_mel_loss=20.36, generator_kl_loss=1.913, generator_dur_loss=1.453, generator_adv_loss=2.061, generator_feat_match_loss=3.938, over 71.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.299, discriminator_fake_loss=1.272, generator_loss=30.38, generator_mel_loss=20.32, generator_kl_loss=1.902, generator_dur_loss=1.476, generator_adv_loss=2.292, generator_feat_match_loss=4.388, over 1959.00 samples.], cur_lr_g: 1.86e-04, cur_lr_d: 1.86e-04, grad_scale: 8.0
2024-02-23 20:59:44,187 INFO [train.py:845] (3/4) Start epoch 604
2024-02-23 21:03:11,978 INFO [train.py:845] (3/4) Start epoch 605
2024-02-23 21:03:34,463 INFO [train.py:471] (3/4) Epoch 605, batch 2, global_batch_idx: 22350, batch size: 54, loss[discriminator_loss=2.523, discriminator_real_loss=1.33, discriminator_fake_loss=1.192, generator_loss=30.28, generator_mel_loss=19.86, generator_kl_loss=1.819, generator_dur_loss=1.493, generator_adv_loss=2.375, generator_feat_match_loss=4.738, over 54.00 samples.], tot_loss[discriminator_loss=2.683, discriminator_real_loss=1.377, discriminator_fake_loss=1.306, generator_loss=30.12, generator_mel_loss=20.33, generator_kl_loss=1.924, generator_dur_loss=1.484, generator_adv_loss=2.197, generator_feat_match_loss=4.19, over 179.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:06:41,211 INFO [train.py:845] (3/4) Start epoch 606
2024-02-23 21:08:15,112 INFO [train.py:471] (3/4) Epoch 606, batch 15, global_batch_idx: 22400, batch size: 49, loss[discriminator_loss=2.559, discriminator_real_loss=1.192, discriminator_fake_loss=1.366, generator_loss=30.75, generator_mel_loss=20.76, generator_kl_loss=1.96, generator_dur_loss=1.499, generator_adv_loss=2.17, generator_feat_match_loss=4.367, over 49.00 samples.], tot_loss[discriminator_loss=2.607, discriminator_real_loss=1.326, discriminator_fake_loss=1.282, generator_loss=30.06, generator_mel_loss=20.14, generator_kl_loss=1.927, generator_dur_loss=1.473, generator_adv_loss=2.249, generator_feat_match_loss=4.277, over 1152.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2024-02-23 21:08:15,113 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 21:08:24,433 INFO [train.py:534] (3/4) Epoch 606, validation: discriminator_loss=2.667, discriminator_real_loss=1.265, discriminator_fake_loss=1.403, generator_loss=30.11, generator_mel_loss=20.68, generator_kl_loss=2.129, generator_dur_loss=1.474, generator_adv_loss=1.816, generator_feat_match_loss=4.011, over 100.00 samples.
2024-02-23 21:08:24,434 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 21:10:21,210 INFO [train.py:845] (3/4) Start epoch 607
2024-02-23 21:13:10,588 INFO [train.py:471] (3/4) Epoch 607, batch 28, global_batch_idx: 22450, batch size: 51, loss[discriminator_loss=3, discriminator_real_loss=1.465, discriminator_fake_loss=1.536, generator_loss=29.39, generator_mel_loss=20.27, generator_kl_loss=1.847, generator_dur_loss=1.484, generator_adv_loss=2.045, generator_feat_match_loss=3.744, over 51.00 samples.], tot_loss[discriminator_loss=2.633, discriminator_real_loss=1.36, discriminator_fake_loss=1.273, generator_loss=30.1, generator_mel_loss=20.18, generator_kl_loss=1.908, generator_dur_loss=1.472, generator_adv_loss=2.233, generator_feat_match_loss=4.306, over 2240.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:13:47,245 INFO [train.py:845] (3/4) Start epoch 608
2024-02-23 21:17:15,428 INFO [train.py:845] (3/4) Start epoch 609
2024-02-23 21:17:49,875 INFO [train.py:471] (3/4) Epoch 609, batch 4, global_batch_idx: 22500, batch size: 58, loss[discriminator_loss=2.582, discriminator_real_loss=1.317, discriminator_fake_loss=1.264, generator_loss=29.6, generator_mel_loss=19.98, generator_kl_loss=1.82, generator_dur_loss=1.485, generator_adv_loss=2.129, generator_feat_match_loss=4.188, over 58.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.327, discriminator_fake_loss=1.273, generator_loss=30.19, generator_mel_loss=20.4, generator_kl_loss=1.927, generator_dur_loss=1.478, generator_adv_loss=2.174, generator_feat_match_loss=4.21, over 343.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:20:47,918 INFO [train.py:845] (3/4) Start epoch 610
2024-02-23 21:22:31,196 INFO [train.py:471] (3/4) Epoch 610, batch 17, global_batch_idx: 22550, batch size: 55, loss[discriminator_loss=2.748, discriminator_real_loss=1.237, discriminator_fake_loss=1.511, generator_loss=29.82, generator_mel_loss=20.39, generator_kl_loss=1.909, generator_dur_loss=1.508, generator_adv_loss=1.983, generator_feat_match_loss=4.035, over 55.00 samples.], tot_loss[discriminator_loss=2.54, discriminator_real_loss=1.265, discriminator_fake_loss=1.275, generator_loss=30.17, generator_mel_loss=20.21, generator_kl_loss=1.918, generator_dur_loss=1.473, generator_adv_loss=2.237, generator_feat_match_loss=4.33, over 1345.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:24:21,313 INFO [train.py:845] (3/4) Start epoch 611
2024-02-23 21:27:24,342 INFO [train.py:471] (3/4) Epoch 611, batch 30, global_batch_idx: 22600, batch size: 63, loss[discriminator_loss=2.652, discriminator_real_loss=1.365, discriminator_fake_loss=1.288, generator_loss=29.63, generator_mel_loss=19.92, generator_kl_loss=1.861, generator_dur_loss=1.485, generator_adv_loss=2.203, generator_feat_match_loss=4.156, over 63.00 samples.], tot_loss[discriminator_loss=2.539, discriminator_real_loss=1.291, discriminator_fake_loss=1.248, generator_loss=30.34, generator_mel_loss=20.05, generator_kl_loss=1.907, generator_dur_loss=1.478, generator_adv_loss=2.359, generator_feat_match_loss=4.548, over 2354.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:27:24,344 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 21:27:32,609 INFO [train.py:534] (3/4) Epoch 611, validation: discriminator_loss=2.492, discriminator_real_loss=1.13, discriminator_fake_loss=1.363, generator_loss=30.45, generator_mel_loss=20.73, generator_kl_loss=2.005, generator_dur_loss=1.475, generator_adv_loss=1.961, generator_feat_match_loss=4.283, over 100.00 samples.
2024-02-23 21:27:32,609 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 21:27:58,746 INFO [train.py:845] (3/4) Start epoch 612
2024-02-23 21:31:26,223 INFO [train.py:845] (3/4) Start epoch 613
2024-02-23 21:32:09,536 INFO [train.py:471] (3/4) Epoch 613, batch 6, global_batch_idx: 22650, batch size: 52, loss[discriminator_loss=2.418, discriminator_real_loss=1.182, discriminator_fake_loss=1.236, generator_loss=29.77, generator_mel_loss=19.61, generator_kl_loss=1.949, generator_dur_loss=1.501, generator_adv_loss=2.273, generator_feat_match_loss=4.43, over 52.00 samples.], tot_loss[discriminator_loss=2.744, discriminator_real_loss=1.395, discriminator_fake_loss=1.349, generator_loss=29.52, generator_mel_loss=19.82, generator_kl_loss=1.92, generator_dur_loss=1.487, generator_adv_loss=2.125, generator_feat_match_loss=4.167, over 463.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:34:54,493 INFO [train.py:845] (3/4) Start epoch 614
2024-02-23 21:36:56,357 INFO [train.py:471] (3/4) Epoch 614, batch 19, global_batch_idx: 22700, batch size: 51, loss[discriminator_loss=2.34, discriminator_real_loss=1.212, discriminator_fake_loss=1.129, generator_loss=30.74, generator_mel_loss=20.09, generator_kl_loss=2.09, generator_dur_loss=1.486, generator_adv_loss=2.311, generator_feat_match_loss=4.762, over 51.00 samples.], tot_loss[discriminator_loss=2.572, discriminator_real_loss=1.304, discriminator_fake_loss=1.268, generator_loss=29.98, generator_mel_loss=20.1, generator_kl_loss=1.907, generator_dur_loss=1.482, generator_adv_loss=2.235, generator_feat_match_loss=4.249, over 1197.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:38:24,363 INFO [train.py:845] (3/4) Start epoch 615
2024-02-23 21:41:27,998 INFO [train.py:471] (3/4) Epoch 615, batch 32, global_batch_idx: 22750, batch size: 126, loss[discriminator_loss=2.502, discriminator_real_loss=1.234, discriminator_fake_loss=1.268, generator_loss=30.51, generator_mel_loss=20.05, generator_kl_loss=1.889, generator_dur_loss=1.481, generator_adv_loss=2.369, generator_feat_match_loss=4.719, over 126.00 samples.], tot_loss[discriminator_loss=2.591, discriminator_real_loss=1.303, discriminator_fake_loss=1.288, generator_loss=30.26, generator_mel_loss=20.18, generator_kl_loss=1.9, generator_dur_loss=1.48, generator_adv_loss=2.309, generator_feat_match_loss=4.392, over 2370.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:41:52,212 INFO [train.py:845] (3/4) Start epoch 616
2024-02-23 21:45:22,659 INFO [train.py:845] (3/4) Start epoch 617
2024-02-23 21:46:17,271 INFO [train.py:471] (3/4) Epoch 617, batch 8, global_batch_idx: 22800, batch size: 52, loss[discriminator_loss=2.582, discriminator_real_loss=1.392, discriminator_fake_loss=1.191, generator_loss=29.96, generator_mel_loss=20.25, generator_kl_loss=1.942, generator_dur_loss=1.475, generator_adv_loss=2.113, generator_feat_match_loss=4.172, over 52.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.334, discriminator_fake_loss=1.278, generator_loss=29.87, generator_mel_loss=20.2, generator_kl_loss=1.933, generator_dur_loss=1.476, generator_adv_loss=2.133, generator_feat_match_loss=4.136, over 630.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2024-02-23 21:46:17,273 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 21:46:25,933 INFO [train.py:534] (3/4) Epoch 617, validation: discriminator_loss=2.678, discriminator_real_loss=1.22, discriminator_fake_loss=1.458, generator_loss=29.99, generator_mel_loss=20.75, generator_kl_loss=2.099, generator_dur_loss=1.477, generator_adv_loss=1.742, generator_feat_match_loss=3.918, over 100.00 samples.
2024-02-23 21:46:25,934 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 21:49:06,423 INFO [train.py:845] (3/4) Start epoch 618
2024-02-23 21:51:12,096 INFO [train.py:471] (3/4) Epoch 618, batch 21, global_batch_idx: 22850, batch size: 52, loss[discriminator_loss=2.574, discriminator_real_loss=1.252, discriminator_fake_loss=1.321, generator_loss=30.47, generator_mel_loss=20.27, generator_kl_loss=1.947, generator_dur_loss=1.482, generator_adv_loss=2.27, generator_feat_match_loss=4.508, over 52.00 samples.], tot_loss[discriminator_loss=2.583, discriminator_real_loss=1.309, discriminator_fake_loss=1.273, generator_loss=30.22, generator_mel_loss=20.25, generator_kl_loss=1.922, generator_dur_loss=1.479, generator_adv_loss=2.235, generator_feat_match_loss=4.332, over 1717.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:52:31,943 INFO [train.py:845] (3/4) Start epoch 619
2024-02-23 21:55:55,971 INFO [train.py:471] (3/4) Epoch 619, batch 34, global_batch_idx: 22900, batch size: 53, loss[discriminator_loss=2.566, discriminator_real_loss=1.214, discriminator_fake_loss=1.354, generator_loss=28.69, generator_mel_loss=19.15, generator_kl_loss=1.916, generator_dur_loss=1.494, generator_adv_loss=2.277, generator_feat_match_loss=3.852, over 53.00 samples.], tot_loss[discriminator_loss=2.526, discriminator_real_loss=1.274, discriminator_fake_loss=1.252, generator_loss=30.39, generator_mel_loss=20.11, generator_kl_loss=1.92, generator_dur_loss=1.483, generator_adv_loss=2.337, generator_feat_match_loss=4.535, over 2280.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 21:56:05,368 INFO [train.py:845] (3/4) Start epoch 620
2024-02-23 21:59:38,500 INFO [train.py:845] (3/4) Start epoch 621
2024-02-23 22:00:50,360 INFO [train.py:471] (3/4) Epoch 621, batch 10, global_batch_idx: 22950, batch size: 73, loss[discriminator_loss=2.629, discriminator_real_loss=1.277, discriminator_fake_loss=1.352, generator_loss=29.7, generator_mel_loss=19.91, generator_kl_loss=1.982, generator_dur_loss=1.483, generator_adv_loss=2.201, generator_feat_match_loss=4.117, over 73.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.302, discriminator_fake_loss=1.292, generator_loss=30.23, generator_mel_loss=20.49, generator_kl_loss=1.95, generator_dur_loss=1.477, generator_adv_loss=2.13, generator_feat_match_loss=4.181, over 824.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:03:05,517 INFO [train.py:845] (3/4) Start epoch 622
2024-02-23 22:05:14,415 INFO [train.py:471] (3/4) Epoch 622, batch 23, global_batch_idx: 23000, batch size: 95, loss[discriminator_loss=2.707, discriminator_real_loss=1.551, discriminator_fake_loss=1.156, generator_loss=28.83, generator_mel_loss=19.97, generator_kl_loss=1.842, generator_dur_loss=1.475, generator_adv_loss=1.953, generator_feat_match_loss=3.594, over 95.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.324, discriminator_fake_loss=1.265, generator_loss=29.92, generator_mel_loss=19.99, generator_kl_loss=1.875, generator_dur_loss=1.48, generator_adv_loss=2.254, generator_feat_match_loss=4.323, over 1715.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:05:14,417 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 22:05:21,927 INFO [train.py:534] (3/4) Epoch 622, validation: discriminator_loss=2.67, discriminator_real_loss=1.258, discriminator_fake_loss=1.413, generator_loss=29.98, generator_mel_loss=20.83, generator_kl_loss=1.95, generator_dur_loss=1.49, generator_adv_loss=1.909, generator_feat_match_loss=3.807, over 100.00 samples.
2024-02-23 22:05:21,927 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 22:06:41,013 INFO [train.py:845] (3/4) Start epoch 623
2024-02-23 22:10:06,700 INFO [train.py:471] (3/4) Epoch 623, batch 36, global_batch_idx: 23050, batch size: 51, loss[discriminator_loss=2.65, discriminator_real_loss=1.417, discriminator_fake_loss=1.233, generator_loss=30.62, generator_mel_loss=20.43, generator_kl_loss=1.983, generator_dur_loss=1.488, generator_adv_loss=2.266, generator_feat_match_loss=4.449, over 51.00 samples.], tot_loss[discriminator_loss=2.594, discriminator_real_loss=1.307, discriminator_fake_loss=1.287, generator_loss=30.06, generator_mel_loss=20.3, generator_kl_loss=1.934, generator_dur_loss=1.477, generator_adv_loss=2.166, generator_feat_match_loss=4.187, over 2702.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:10:07,164 INFO [train.py:845] (3/4) Start epoch 624
2024-02-23 22:13:30,572 INFO [train.py:845] (3/4) Start epoch 625
2024-02-23 22:14:46,671 INFO [train.py:471] (3/4) Epoch 625, batch 12, global_batch_idx: 23100, batch size: 154, loss[discriminator_loss=2.393, discriminator_real_loss=1.085, discriminator_fake_loss=1.308, generator_loss=30.31, generator_mel_loss=19.91, generator_kl_loss=1.931, generator_dur_loss=1.448, generator_adv_loss=2.275, generator_feat_match_loss=4.742, over 154.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.236, discriminator_fake_loss=1.285, generator_loss=30.02, generator_mel_loss=19.89, generator_kl_loss=1.92, generator_dur_loss=1.473, generator_adv_loss=2.222, generator_feat_match_loss=4.518, over 1006.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:16:59,461 INFO [train.py:845] (3/4) Start epoch 626
2024-02-23 22:19:23,216 INFO [train.py:471] (3/4) Epoch 626, batch 25, global_batch_idx: 23150, batch size: 56, loss[discriminator_loss=2.73, discriminator_real_loss=1.549, discriminator_fake_loss=1.181, generator_loss=30.26, generator_mel_loss=20.39, generator_kl_loss=1.953, generator_dur_loss=1.48, generator_adv_loss=2.332, generator_feat_match_loss=4.102, over 56.00 samples.], tot_loss[discriminator_loss=2.612, discriminator_real_loss=1.327, discriminator_fake_loss=1.285, generator_loss=29.94, generator_mel_loss=20.28, generator_kl_loss=1.931, generator_dur_loss=1.481, generator_adv_loss=2.174, generator_feat_match_loss=4.073, over 1809.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:20:26,415 INFO [train.py:845] (3/4) Start epoch 627
2024-02-23 22:24:00,731 INFO [train.py:845] (3/4) Start epoch 628
2024-02-23 22:24:20,793 INFO [train.py:471] (3/4) Epoch 628, batch 1, global_batch_idx: 23200, batch size: 153, loss[discriminator_loss=2.762, discriminator_real_loss=1.19, discriminator_fake_loss=1.57, generator_loss=29.51, generator_mel_loss=20.28, generator_kl_loss=1.836, generator_dur_loss=1.461, generator_adv_loss=2.014, generator_feat_match_loss=3.924, over 153.00 samples.], tot_loss[discriminator_loss=2.705, discriminator_real_loss=1.154, discriminator_fake_loss=1.55, generator_loss=29.9, generator_mel_loss=20.37, generator_kl_loss=1.866, generator_dur_loss=1.468, generator_adv_loss=2.035, generator_feat_match_loss=4.161, over 213.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2024-02-23 22:24:20,794 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 22:24:29,207 INFO [train.py:534] (3/4) Epoch 628, validation: discriminator_loss=2.579, discriminator_real_loss=1.18, discriminator_fake_loss=1.399, generator_loss=30.14, generator_mel_loss=20.74, generator_kl_loss=2.018, generator_dur_loss=1.475, generator_adv_loss=1.915, generator_feat_match_loss=3.996, over 100.00 samples.
2024-02-23 22:24:29,208 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 22:27:33,453 INFO [train.py:845] (3/4) Start epoch 629
2024-02-23 22:29:01,801 INFO [train.py:471] (3/4) Epoch 629, batch 14, global_batch_idx: 23250, batch size: 73, loss[discriminator_loss=2.414, discriminator_real_loss=1.152, discriminator_fake_loss=1.263, generator_loss=30.5, generator_mel_loss=20.02, generator_kl_loss=1.968, generator_dur_loss=1.469, generator_adv_loss=2.393, generator_feat_match_loss=4.656, over 73.00 samples.], tot_loss[discriminator_loss=2.483, discriminator_real_loss=1.247, discriminator_fake_loss=1.236, generator_loss=30.43, generator_mel_loss=20.21, generator_kl_loss=1.921, generator_dur_loss=1.468, generator_adv_loss=2.287, generator_feat_match_loss=4.544, over 1247.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:31:02,294 INFO [train.py:845] (3/4) Start epoch 630
2024-02-23 22:33:46,967 INFO [train.py:471] (3/4) Epoch 630, batch 27, global_batch_idx: 23300, batch size: 126, loss[discriminator_loss=2.305, discriminator_real_loss=1.211, discriminator_fake_loss=1.093, generator_loss=31.08, generator_mel_loss=20.04, generator_kl_loss=1.916, generator_dur_loss=1.489, generator_adv_loss=2.592, generator_feat_match_loss=5.039, over 126.00 samples.], tot_loss[discriminator_loss=2.525, discriminator_real_loss=1.285, discriminator_fake_loss=1.24, generator_loss=30.59, generator_mel_loss=20.18, generator_kl_loss=1.918, generator_dur_loss=1.479, generator_adv_loss=2.378, generator_feat_match_loss=4.639, over 2158.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:34:31,513 INFO [train.py:845] (3/4) Start epoch 631
2024-02-23 22:38:02,202 INFO [train.py:845] (3/4) Start epoch 632
2024-02-23 22:38:29,943 INFO [train.py:471] (3/4) Epoch 632, batch 3, global_batch_idx: 23350, batch size: 50, loss[discriminator_loss=2.463, discriminator_real_loss=1.238, discriminator_fake_loss=1.225, generator_loss=31.32, generator_mel_loss=20.66, generator_kl_loss=1.922, generator_dur_loss=1.483, generator_adv_loss=2.4, generator_feat_match_loss=4.859, over 50.00 samples.], tot_loss[discriminator_loss=2.606, discriminator_real_loss=1.34, discriminator_fake_loss=1.266, generator_loss=30.35, generator_mel_loss=20.29, generator_kl_loss=1.895, generator_dur_loss=1.47, generator_adv_loss=2.261, generator_feat_match_loss=4.44, over 281.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:41:30,025 INFO [train.py:845] (3/4) Start epoch 633
2024-02-23 22:43:14,742 INFO [train.py:471] (3/4) Epoch 633, batch 16, global_batch_idx: 23400, batch size: 65, loss[discriminator_loss=2.664, discriminator_real_loss=1.145, discriminator_fake_loss=1.519, generator_loss=29.25, generator_mel_loss=19.87, generator_kl_loss=1.867, generator_dur_loss=1.48, generator_adv_loss=2.051, generator_feat_match_loss=3.984, over 65.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.29, discriminator_fake_loss=1.281, generator_loss=30.12, generator_mel_loss=20.05, generator_kl_loss=1.929, generator_dur_loss=1.476, generator_adv_loss=2.239, generator_feat_match_loss=4.42, over 1292.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:43:14,744 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 22:43:23,011 INFO [train.py:534] (3/4) Epoch 633, validation: discriminator_loss=2.503, discriminator_real_loss=1.229, discriminator_fake_loss=1.274, generator_loss=30.52, generator_mel_loss=20.75, generator_kl_loss=2.036, generator_dur_loss=1.476, generator_adv_loss=1.971, generator_feat_match_loss=4.291, over 100.00 samples.
2024-02-23 22:43:23,012 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 22:45:04,099 INFO [train.py:845] (3/4) Start epoch 634
2024-02-23 22:47:51,073 INFO [train.py:471] (3/4) Epoch 634, batch 29, global_batch_idx: 23450, batch size: 90, loss[discriminator_loss=2.711, discriminator_real_loss=1.206, discriminator_fake_loss=1.506, generator_loss=30.04, generator_mel_loss=20.49, generator_kl_loss=1.891, generator_dur_loss=1.489, generator_adv_loss=2.256, generator_feat_match_loss=3.914, over 90.00 samples.], tot_loss[discriminator_loss=2.578, discriminator_real_loss=1.298, discriminator_fake_loss=1.28, generator_loss=30.47, generator_mel_loss=20.48, generator_kl_loss=1.942, generator_dur_loss=1.482, generator_adv_loss=2.236, generator_feat_match_loss=4.332, over 2165.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:48:33,596 INFO [train.py:845] (3/4) Start epoch 635
2024-02-23 22:52:11,405 INFO [train.py:845] (3/4) Start epoch 636
2024-02-23 22:52:48,082 INFO [train.py:471] (3/4) Epoch 636, batch 5, global_batch_idx: 23500, batch size: 67, loss[discriminator_loss=2.568, discriminator_real_loss=1.392, discriminator_fake_loss=1.177, generator_loss=29.63, generator_mel_loss=19.93, generator_kl_loss=1.907, generator_dur_loss=1.475, generator_adv_loss=2.236, generator_feat_match_loss=4.086, over 67.00 samples.], tot_loss[discriminator_loss=2.565, discriminator_real_loss=1.3, discriminator_fake_loss=1.265, generator_loss=30.16, generator_mel_loss=20.18, generator_kl_loss=1.931, generator_dur_loss=1.477, generator_adv_loss=2.23, generator_feat_match_loss=4.339, over 427.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:55:42,980 INFO [train.py:845] (3/4) Start epoch 637
2024-02-23 22:57:38,561 INFO [train.py:471] (3/4) Epoch 637, batch 18, global_batch_idx: 23550, batch size: 81, loss[discriminator_loss=2.674, discriminator_real_loss=1.289, discriminator_fake_loss=1.385, generator_loss=29.64, generator_mel_loss=19.98, generator_kl_loss=1.876, generator_dur_loss=1.48, generator_adv_loss=2.082, generator_feat_match_loss=4.227, over 81.00 samples.], tot_loss[discriminator_loss=2.562, discriminator_real_loss=1.293, discriminator_fake_loss=1.268, generator_loss=30.32, generator_mel_loss=20.11, generator_kl_loss=1.92, generator_dur_loss=1.474, generator_adv_loss=2.287, generator_feat_match_loss=4.527, over 1533.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 22:59:08,164 INFO [train.py:845] (3/4) Start epoch 638
2024-02-23 23:02:09,913 INFO [train.py:471] (3/4) Epoch 638, batch 31, global_batch_idx: 23600, batch size: 67, loss[discriminator_loss=2.65, discriminator_real_loss=1.318, discriminator_fake_loss=1.332, generator_loss=29.02, generator_mel_loss=19.48, generator_kl_loss=1.885, generator_dur_loss=1.481, generator_adv_loss=2.076, generator_feat_match_loss=4.094, over 67.00 samples.], tot_loss[discriminator_loss=2.483, discriminator_real_loss=1.253, discriminator_fake_loss=1.23, generator_loss=30.55, generator_mel_loss=20.05, generator_kl_loss=1.929, generator_dur_loss=1.48, generator_adv_loss=2.366, generator_feat_match_loss=4.731, over 2261.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 16.0
2024-02-23 23:02:09,915 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 23:02:18,498 INFO [train.py:534] (3/4) Epoch 638, validation: discriminator_loss=2.534, discriminator_real_loss=1.228, discriminator_fake_loss=1.306, generator_loss=30.89, generator_mel_loss=20.93, generator_kl_loss=1.999, generator_dur_loss=1.478, generator_adv_loss=2.078, generator_feat_match_loss=4.401, over 100.00 samples.
2024-02-23 23:02:18,499 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 23:02:44,308 INFO [train.py:845] (3/4) Start epoch 639
2024-02-23 23:06:13,874 INFO [train.py:845] (3/4) Start epoch 640
2024-02-23 23:07:06,833 INFO [train.py:471] (3/4) Epoch 640, batch 7, global_batch_idx: 23650, batch size: 64, loss[discriminator_loss=2.645, discriminator_real_loss=1.281, discriminator_fake_loss=1.363, generator_loss=29.24, generator_mel_loss=19.69, generator_kl_loss=2.01, generator_dur_loss=1.467, generator_adv_loss=2.133, generator_feat_match_loss=3.939, over 64.00 samples.], tot_loss[discriminator_loss=2.546, discriminator_real_loss=1.272, discriminator_fake_loss=1.274, generator_loss=30.39, generator_mel_loss=20.25, generator_kl_loss=1.921, generator_dur_loss=1.479, generator_adv_loss=2.303, generator_feat_match_loss=4.44, over 606.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:09:37,572 INFO [train.py:845] (3/4) Start epoch 641
2024-02-23 23:11:39,842 INFO [train.py:471] (3/4) Epoch 641, batch 20, global_batch_idx: 23700, batch size: 95, loss[discriminator_loss=2.465, discriminator_real_loss=1.232, discriminator_fake_loss=1.231, generator_loss=31.25, generator_mel_loss=20.01, generator_kl_loss=1.916, generator_dur_loss=1.468, generator_adv_loss=2.611, generator_feat_match_loss=5.242, over 95.00 samples.], tot_loss[discriminator_loss=2.501, discriminator_real_loss=1.248, discriminator_fake_loss=1.252, generator_loss=30.48, generator_mel_loss=20.07, generator_kl_loss=1.9, generator_dur_loss=1.476, generator_adv_loss=2.358, generator_feat_match_loss=4.672, over 1653.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:13:06,616 INFO [train.py:845] (3/4) Start epoch 642
2024-02-23 23:16:19,796 INFO [train.py:471] (3/4) Epoch 642, batch 33, global_batch_idx: 23750, batch size: 126, loss[discriminator_loss=2.559, discriminator_real_loss=1.416, discriminator_fake_loss=1.143, generator_loss=30.69, generator_mel_loss=20.62, generator_kl_loss=1.919, generator_dur_loss=1.454, generator_adv_loss=2.379, generator_feat_match_loss=4.324, over 126.00 samples.], tot_loss[discriminator_loss=2.559, discriminator_real_loss=1.304, discriminator_fake_loss=1.255, generator_loss=30.26, generator_mel_loss=20.03, generator_kl_loss=1.921, generator_dur_loss=1.477, generator_adv_loss=2.3, generator_feat_match_loss=4.527, over 2570.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:16:34,578 INFO [train.py:845] (3/4) Start epoch 643
2024-02-23 23:20:05,708 INFO [train.py:845] (3/4) Start epoch 644
2024-02-23 23:21:08,330 INFO [train.py:471] (3/4) Epoch 644, batch 9, global_batch_idx: 23800, batch size: 126, loss[discriminator_loss=2.439, discriminator_real_loss=1.29, discriminator_fake_loss=1.149, generator_loss=31.02, generator_mel_loss=20.24, generator_kl_loss=1.855, generator_dur_loss=1.474, generator_adv_loss=2.461, generator_feat_match_loss=4.984, over 126.00 samples.], tot_loss[discriminator_loss=2.514, discriminator_real_loss=1.272, discriminator_fake_loss=1.242, generator_loss=30.57, generator_mel_loss=20.08, generator_kl_loss=1.91, generator_dur_loss=1.481, generator_adv_loss=2.382, generator_feat_match_loss=4.717, over 804.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:21:08,331 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 23:21:16,040 INFO [train.py:534] (3/4) Epoch 644, validation: discriminator_loss=2.447, discriminator_real_loss=1.232, discriminator_fake_loss=1.215, generator_loss=30.7, generator_mel_loss=20.56, generator_kl_loss=1.99, generator_dur_loss=1.477, generator_adv_loss=2.103, generator_feat_match_loss=4.574, over 100.00 samples.
2024-02-23 23:21:16,041 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 23:23:45,819 INFO [train.py:845] (3/4) Start epoch 645
2024-02-23 23:25:57,547 INFO [train.py:471] (3/4) Epoch 645, batch 22, global_batch_idx: 23850, batch size: 52, loss[discriminator_loss=2.852, discriminator_real_loss=1.195, discriminator_fake_loss=1.656, generator_loss=29.01, generator_mel_loss=19.92, generator_kl_loss=1.857, generator_dur_loss=1.489, generator_adv_loss=1.943, generator_feat_match_loss=3.803, over 52.00 samples.], tot_loss[discriminator_loss=2.467, discriminator_real_loss=1.227, discriminator_fake_loss=1.24, generator_loss=30.72, generator_mel_loss=20.09, generator_kl_loss=1.942, generator_dur_loss=1.479, generator_adv_loss=2.396, generator_feat_match_loss=4.813, over 1692.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:27:12,472 INFO [train.py:845] (3/4) Start epoch 646
2024-02-23 23:30:39,015 INFO [train.py:471] (3/4) Epoch 646, batch 35, global_batch_idx: 23900, batch size: 60, loss[discriminator_loss=2.422, discriminator_real_loss=1.225, discriminator_fake_loss=1.198, generator_loss=30.27, generator_mel_loss=19.82, generator_kl_loss=1.834, generator_dur_loss=1.486, generator_adv_loss=2.326, generator_feat_match_loss=4.805, over 60.00 samples.], tot_loss[discriminator_loss=2.405, discriminator_real_loss=1.214, discriminator_fake_loss=1.192, generator_loss=30.63, generator_mel_loss=19.85, generator_kl_loss=1.922, generator_dur_loss=1.478, generator_adv_loss=2.445, generator_feat_match_loss=4.938, over 2723.00 samples.], cur_lr_g: 1.85e-04, cur_lr_d: 1.85e-04, grad_scale: 8.0
2024-02-23 23:30:43,816 INFO [train.py:845] (3/4) Start epoch 647
2024-02-23 23:34:06,100 INFO [train.py:845] (3/4) Start epoch 648
2024-02-23 23:35:16,052 INFO [train.py:471] (3/4) Epoch 648, batch 11, global_batch_idx: 23950, batch size: 51, loss[discriminator_loss=2.637, discriminator_real_loss=1.328, discriminator_fake_loss=1.308, generator_loss=30.55, generator_mel_loss=19.94, generator_kl_loss=1.898, generator_dur_loss=1.482, generator_adv_loss=2.309, generator_feat_match_loss=4.926, over 51.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.273, discriminator_fake_loss=1.247, generator_loss=30.33, generator_mel_loss=19.88, generator_kl_loss=1.918, generator_dur_loss=1.478, generator_adv_loss=2.37, generator_feat_match_loss=4.677, over 877.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-23 23:37:33,661 INFO [train.py:845] (3/4) Start epoch 649
2024-02-23 23:40:00,767 INFO [train.py:471] (3/4) Epoch 649, batch 24, global_batch_idx: 24000, batch size: 85, loss[discriminator_loss=2.496, discriminator_real_loss=1.331, discriminator_fake_loss=1.166, generator_loss=30.29, generator_mel_loss=19.97, generator_kl_loss=1.98, generator_dur_loss=1.476, generator_adv_loss=2.205, generator_feat_match_loss=4.656, over 85.00 samples.], tot_loss[discriminator_loss=2.452, discriminator_real_loss=1.241, discriminator_fake_loss=1.211, generator_loss=30.45, generator_mel_loss=19.78, generator_kl_loss=1.938, generator_dur_loss=1.475, generator_adv_loss=2.41, generator_feat_match_loss=4.846, over 1847.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0
2024-02-23 23:40:00,769 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 23:40:08,796 INFO [train.py:534] (3/4) Epoch 649, validation: discriminator_loss=2.826, discriminator_real_loss=1.135, discriminator_fake_loss=1.69, generator_loss=29.89, generator_mel_loss=20.67, generator_kl_loss=2.152, generator_dur_loss=1.476, generator_adv_loss=1.588, generator_feat_match_loss=4.01, over 100.00 samples.
2024-02-23 23:40:08,796 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-23 23:41:10,804 INFO [train.py:845] (3/4) Start epoch 650
2024-02-23 23:44:40,861 INFO [train.py:845] (3/4) Start epoch 651
2024-02-23 23:44:54,569 INFO [train.py:471] (3/4) Epoch 651, batch 0, global_batch_idx: 24050, batch size: 90, loss[discriminator_loss=2.305, discriminator_real_loss=1.26, discriminator_fake_loss=1.045, generator_loss=31.08, generator_mel_loss=20.06, generator_kl_loss=1.936, generator_dur_loss=1.454, generator_adv_loss=2.484, generator_feat_match_loss=5.148, over 90.00 samples.], tot_loss[discriminator_loss=2.305, discriminator_real_loss=1.26, discriminator_fake_loss=1.045, generator_loss=31.08, generator_mel_loss=20.06, generator_kl_loss=1.936, generator_dur_loss=1.454, generator_adv_loss=2.484, generator_feat_match_loss=5.148, over 90.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-23 23:48:13,730 INFO [train.py:845] (3/4) Start epoch 652
2024-02-23 23:49:46,689 INFO [train.py:471] (3/4) Epoch 652, batch 13, global_batch_idx: 24100, batch size: 54, loss[discriminator_loss=2.555, discriminator_real_loss=1.248, discriminator_fake_loss=1.307, generator_loss=29.3, generator_mel_loss=19.62, generator_kl_loss=1.955, generator_dur_loss=1.504, generator_adv_loss=2.264, generator_feat_match_loss=3.949, over 54.00 samples.], tot_loss[discriminator_loss=2.528, discriminator_real_loss=1.293, discriminator_fake_loss=1.234, generator_loss=30.51, generator_mel_loss=20.22, generator_kl_loss=1.934, generator_dur_loss=1.475, generator_adv_loss=2.318, generator_feat_match_loss=4.565, over 1116.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-23 23:51:48,089 INFO [train.py:845] (3/4) Start epoch 653
2024-02-23 23:54:24,168 INFO [train.py:471] (3/4) Epoch 653, batch 26, global_batch_idx: 24150, batch size: 69, loss[discriminator_loss=2.51, discriminator_real_loss=1.267, discriminator_fake_loss=1.243, generator_loss=30.32, generator_mel_loss=20.06, generator_kl_loss=1.929, generator_dur_loss=1.492, generator_adv_loss=2.188, generator_feat_match_loss=4.645, over 69.00 samples.], tot_loss[discriminator_loss=2.452, discriminator_real_loss=1.239, discriminator_fake_loss=1.213, generator_loss=30.33, generator_mel_loss=19.77, generator_kl_loss=1.899, generator_dur_loss=1.474, generator_adv_loss=2.382, generator_feat_match_loss=4.802, over 1996.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-23 23:55:19,838 INFO [train.py:845] (3/4) Start epoch 654
2024-02-23 23:58:43,448 INFO [train.py:845] (3/4) Start epoch 655
2024-02-23 23:59:06,358 INFO [train.py:471] (3/4) Epoch 655, batch 2, global_batch_idx: 24200, batch size: 65, loss[discriminator_loss=2.633, discriminator_real_loss=1.26, discriminator_fake_loss=1.373, generator_loss=29.99, generator_mel_loss=19.84, generator_kl_loss=1.907, generator_dur_loss=1.479, generator_adv_loss=2.158, generator_feat_match_loss=4.605, over 65.00 samples.], tot_loss[discriminator_loss=2.5, discriminator_real_loss=1.253, discriminator_fake_loss=1.248, generator_loss=30.04, generator_mel_loss=19.63, generator_kl_loss=1.919, generator_dur_loss=1.484, generator_adv_loss=2.338, generator_feat_match_loss=4.662, over 181.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-23 23:59:06,358 INFO [train.py:525] (3/4) Computing validation loss
2024-02-23 23:59:15,150 INFO [train.py:534] (3/4) Epoch 655, validation: discriminator_loss=2.603, discriminator_real_loss=1.198, discriminator_fake_loss=1.405, generator_loss=30.21, generator_mel_loss=20.53, generator_kl_loss=1.976, generator_dur_loss=1.479, generator_adv_loss=1.886, generator_feat_match_loss=4.341, over 100.00 samples.
2024-02-23 23:59:15,151 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 00:02:25,340 INFO [train.py:845] (3/4) Start epoch 656
2024-02-24 00:04:02,931 INFO [train.py:471] (3/4) Epoch 656, batch 15, global_batch_idx: 24250, batch size: 79, loss[discriminator_loss=2.398, discriminator_real_loss=1.192, discriminator_fake_loss=1.205, generator_loss=30.83, generator_mel_loss=19.96, generator_kl_loss=1.895, generator_dur_loss=1.467, generator_adv_loss=2.432, generator_feat_match_loss=5.078, over 79.00 samples.], tot_loss[discriminator_loss=2.498, discriminator_real_loss=1.265, discriminator_fake_loss=1.233, generator_loss=30.47, generator_mel_loss=20.01, generator_kl_loss=1.917, generator_dur_loss=1.471, generator_adv_loss=2.378, generator_feat_match_loss=4.695, over 1195.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:05:53,934 INFO [train.py:845] (3/4) Start epoch 657
2024-02-24 00:08:44,796 INFO [train.py:471] (3/4) Epoch 657, batch 28, global_batch_idx: 24300, batch size: 53, loss[discriminator_loss=2.541, discriminator_real_loss=1.217, discriminator_fake_loss=1.324, generator_loss=30.31, generator_mel_loss=19.94, generator_kl_loss=1.845, generator_dur_loss=1.507, generator_adv_loss=2.467, generator_feat_match_loss=4.559, over 53.00 samples.], tot_loss[discriminator_loss=2.466, discriminator_real_loss=1.243, discriminator_fake_loss=1.223, generator_loss=30.46, generator_mel_loss=19.86, generator_kl_loss=1.92, generator_dur_loss=1.477, generator_adv_loss=2.381, generator_feat_match_loss=4.822, over 2150.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:09:29,766 INFO [train.py:845] (3/4) Start epoch 658
2024-02-24 00:12:54,533 INFO [train.py:845] (3/4) Start epoch 659
2024-02-24 00:13:37,274 INFO [train.py:471] (3/4) Epoch 659, batch 4, global_batch_idx: 24350, batch size: 76, loss[discriminator_loss=2.393, discriminator_real_loss=1.428, discriminator_fake_loss=0.9648, generator_loss=30.62, generator_mel_loss=19.78, generator_kl_loss=1.953, generator_dur_loss=1.482, generator_adv_loss=2.553, generator_feat_match_loss=4.852, over 76.00 samples.], tot_loss[discriminator_loss=2.473, discriminator_real_loss=1.258, discriminator_fake_loss=1.216, generator_loss=30.89, generator_mel_loss=20.14, generator_kl_loss=1.923, generator_dur_loss=1.463, generator_adv_loss=2.508, generator_feat_match_loss=4.855, over 437.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:16:24,311 INFO [train.py:845] (3/4) Start epoch 660
2024-02-24 00:18:11,442 INFO [train.py:471] (3/4) Epoch 660, batch 17, global_batch_idx: 24400, batch size: 63, loss[discriminator_loss=2.27, discriminator_real_loss=1.119, discriminator_fake_loss=1.149, generator_loss=30.6, generator_mel_loss=19.09, generator_kl_loss=1.87, generator_dur_loss=1.452, generator_adv_loss=2.682, generator_feat_match_loss=5.512, over 63.00 samples.], tot_loss[discriminator_loss=2.388, discriminator_real_loss=1.209, discriminator_fake_loss=1.178, generator_loss=30.79, generator_mel_loss=19.77, generator_kl_loss=1.907, generator_dur_loss=1.471, generator_adv_loss=2.499, generator_feat_match_loss=5.147, over 1338.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0
2024-02-24 00:18:11,444 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 00:18:19,773 INFO [train.py:534] (3/4) Epoch 660, validation: discriminator_loss=2.242, discriminator_real_loss=1.085, discriminator_fake_loss=1.157, generator_loss=32.49, generator_mel_loss=20.58, generator_kl_loss=2.052, generator_dur_loss=1.474, generator_adv_loss=2.542, generator_feat_match_loss=5.837, over 100.00 samples.
2024-02-24 00:18:19,775 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 00:20:05,007 INFO [train.py:845] (3/4) Start epoch 661
2024-02-24 00:22:58,898 INFO [train.py:471] (3/4) Epoch 661, batch 30, global_batch_idx: 24450, batch size: 60, loss[discriminator_loss=2.383, discriminator_real_loss=1.125, discriminator_fake_loss=1.257, generator_loss=30.14, generator_mel_loss=19.6, generator_kl_loss=1.882, generator_dur_loss=1.471, generator_adv_loss=2.291, generator_feat_match_loss=4.895, over 60.00 samples.], tot_loss[discriminator_loss=2.45, discriminator_real_loss=1.225, discriminator_fake_loss=1.225, generator_loss=30.33, generator_mel_loss=19.77, generator_kl_loss=1.919, generator_dur_loss=1.478, generator_adv_loss=2.377, generator_feat_match_loss=4.781, over 2193.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:23:33,399 INFO [train.py:845] (3/4) Start epoch 662
2024-02-24 00:27:03,473 INFO [train.py:845] (3/4) Start epoch 663
2024-02-24 00:27:48,107 INFO [train.py:471] (3/4) Epoch 663, batch 6, global_batch_idx: 24500, batch size: 69, loss[discriminator_loss=2.465, discriminator_real_loss=1.277, discriminator_fake_loss=1.188, generator_loss=29.72, generator_mel_loss=19.67, generator_kl_loss=1.952, generator_dur_loss=1.464, generator_adv_loss=2.227, generator_feat_match_loss=4.406, over 69.00 samples.], tot_loss[discriminator_loss=2.428, discriminator_real_loss=1.191, discriminator_fake_loss=1.236, generator_loss=30.47, generator_mel_loss=19.94, generator_kl_loss=1.947, generator_dur_loss=1.482, generator_adv_loss=2.301, generator_feat_match_loss=4.8, over 530.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:30:30,397 INFO [train.py:845] (3/4) Start epoch 664
2024-02-24 00:32:27,798 INFO [train.py:471] (3/4) Epoch 664, batch 19, global_batch_idx: 24550, batch size: 110, loss[discriminator_loss=2.461, discriminator_real_loss=1.275, discriminator_fake_loss=1.186, generator_loss=30.42, generator_mel_loss=19.85, generator_kl_loss=1.931, generator_dur_loss=1.461, generator_adv_loss=2.539, generator_feat_match_loss=4.641, over 110.00 samples.], tot_loss[discriminator_loss=2.427, discriminator_real_loss=1.217, discriminator_fake_loss=1.21, generator_loss=30.3, generator_mel_loss=19.6, generator_kl_loss=1.935, generator_dur_loss=1.474, generator_adv_loss=2.402, generator_feat_match_loss=4.889, over 1499.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:34:00,303 INFO [train.py:845] (3/4) Start epoch 665
2024-02-24 00:37:00,930 INFO [train.py:471] (3/4) Epoch 665, batch 32, global_batch_idx: 24600, batch size: 85, loss[discriminator_loss=2.234, discriminator_real_loss=1.169, discriminator_fake_loss=1.064, generator_loss=31.14, generator_mel_loss=19.58, generator_kl_loss=1.969, generator_dur_loss=1.457, generator_adv_loss=2.645, generator_feat_match_loss=5.488, over 85.00 samples.], tot_loss[discriminator_loss=2.41, discriminator_real_loss=1.205, discriminator_fake_loss=1.205, generator_loss=30.64, generator_mel_loss=19.79, generator_kl_loss=1.941, generator_dur_loss=1.476, generator_adv_loss=2.439, generator_feat_match_loss=4.998, over 2321.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:37:00,932 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 00:37:10,036 INFO [train.py:534] (3/4) Epoch 665, validation: discriminator_loss=2.248, discriminator_real_loss=1.099, discriminator_fake_loss=1.148, generator_loss=31.86, generator_mel_loss=20.2, generator_kl_loss=1.947, generator_dur_loss=1.474, generator_adv_loss=2.509, generator_feat_match_loss=5.722, over 100.00 samples.
2024-02-24 00:37:10,037 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 00:37:33,034 INFO [train.py:845] (3/4) Start epoch 666
2024-02-24 00:41:05,876 INFO [train.py:845] (3/4) Start epoch 667
2024-02-24 00:42:03,414 INFO [train.py:471] (3/4) Epoch 667, batch 8, global_batch_idx: 24650, batch size: 61, loss[discriminator_loss=2.281, discriminator_real_loss=1.082, discriminator_fake_loss=1.2, generator_loss=31.65, generator_mel_loss=19.84, generator_kl_loss=1.893, generator_dur_loss=1.478, generator_adv_loss=2.535, generator_feat_match_loss=5.902, over 61.00 samples.], tot_loss[discriminator_loss=2.406, discriminator_real_loss=1.195, discriminator_fake_loss=1.212, generator_loss=30.8, generator_mel_loss=19.73, generator_kl_loss=1.91, generator_dur_loss=1.471, generator_adv_loss=2.439, generator_feat_match_loss=5.255, over 660.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:44:32,938 INFO [train.py:845] (3/4) Start epoch 668
2024-02-24 00:46:36,381 INFO [train.py:471] (3/4) Epoch 668, batch 21, global_batch_idx: 24700, batch size: 85, loss[discriminator_loss=2.445, discriminator_real_loss=1.114, discriminator_fake_loss=1.332, generator_loss=30.15, generator_mel_loss=19.83, generator_kl_loss=1.947, generator_dur_loss=1.466, generator_adv_loss=2.207, generator_feat_match_loss=4.703, over 85.00 samples.], tot_loss[discriminator_loss=2.399, discriminator_real_loss=1.208, discriminator_fake_loss=1.191, generator_loss=30.47, generator_mel_loss=19.59, generator_kl_loss=1.896, generator_dur_loss=1.472, generator_adv_loss=2.456, generator_feat_match_loss=5.062, over 1586.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:48:05,635 INFO [train.py:845] (3/4) Start epoch 669
2024-02-24 00:51:19,300 INFO [train.py:471] (3/4) Epoch 669, batch 34, global_batch_idx: 24750, batch size: 79, loss[discriminator_loss=2.68, discriminator_real_loss=1.231, discriminator_fake_loss=1.447, generator_loss=30.9, generator_mel_loss=20.06, generator_kl_loss=1.926, generator_dur_loss=1.491, generator_adv_loss=2.627, generator_feat_match_loss=4.793, over 79.00 samples.], tot_loss[discriminator_loss=2.64, discriminator_real_loss=1.362, discriminator_fake_loss=1.278, generator_loss=30.47, generator_mel_loss=20.18, generator_kl_loss=1.94, generator_dur_loss=1.479, generator_adv_loss=2.358, generator_feat_match_loss=4.522, over 2379.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 00:51:32,130 INFO [train.py:845] (3/4) Start epoch 670
2024-02-24 00:54:58,823 INFO [train.py:845] (3/4) Start epoch 671
2024-02-24 00:56:06,432 INFO [train.py:471] (3/4) Epoch 671, batch 10, global_batch_idx: 24800, batch size: 64, loss[discriminator_loss=2.793, discriminator_real_loss=1.42, discriminator_fake_loss=1.374, generator_loss=27.39, generator_mel_loss=19, generator_kl_loss=1.847, generator_dur_loss=1.481, generator_adv_loss=1.818, generator_feat_match_loss=3.252, over 64.00 samples.], tot_loss[discriminator_loss=2.756, discriminator_real_loss=1.427, discriminator_fake_loss=1.329, generator_loss=27.97, generator_mel_loss=19.29, generator_kl_loss=1.919, generator_dur_loss=1.476, generator_adv_loss=1.898, generator_feat_match_loss=3.38, over 757.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 16.0
2024-02-24 00:56:06,433 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 00:56:14,207 INFO [train.py:534] (3/4) Epoch 671, validation: discriminator_loss=2.773, discriminator_real_loss=1.395, discriminator_fake_loss=1.378, generator_loss=29.3, generator_mel_loss=20.4, generator_kl_loss=2.083, generator_dur_loss=1.47, generator_adv_loss=1.803, generator_feat_match_loss=3.54, over 100.00 samples.
2024-02-24 00:56:14,208 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 00:58:35,540 INFO [train.py:845] (3/4) Start epoch 672
2024-02-24 01:00:52,631 INFO [train.py:471] (3/4) Epoch 672, batch 23, global_batch_idx: 24850, batch size: 59, loss[discriminator_loss=2.572, discriminator_real_loss=1.225, discriminator_fake_loss=1.348, generator_loss=30.02, generator_mel_loss=19.86, generator_kl_loss=1.869, generator_dur_loss=1.48, generator_adv_loss=2.418, generator_feat_match_loss=4.395, over 59.00 samples.], tot_loss[discriminator_loss=2.699, discriminator_real_loss=1.379, discriminator_fake_loss=1.32, generator_loss=29.13, generator_mel_loss=19.75, generator_kl_loss=1.918, generator_dur_loss=1.469, generator_adv_loss=2.043, generator_feat_match_loss=3.956, over 1830.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 01:02:02,389 INFO [train.py:845] (3/4) Start epoch 673
2024-02-24 01:05:32,298 INFO [train.py:471] (3/4) Epoch 673, batch 36, global_batch_idx: 24900, batch size: 79, loss[discriminator_loss=2.645, discriminator_real_loss=1.294, discriminator_fake_loss=1.352, generator_loss=29.42, generator_mel_loss=20.12, generator_kl_loss=1.93, generator_dur_loss=1.475, generator_adv_loss=2.08, generator_feat_match_loss=3.809, over 79.00 samples.], tot_loss[discriminator_loss=2.693, discriminator_real_loss=1.362, discriminator_fake_loss=1.332, generator_loss=29.04, generator_mel_loss=19.81, generator_kl_loss=1.913, generator_dur_loss=1.477, generator_adv_loss=1.997, generator_feat_match_loss=3.836, over 2586.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 01:05:32,709 INFO [train.py:845] (3/4) Start epoch 674
2024-02-24 01:08:58,009 INFO [train.py:845] (3/4) Start epoch 675
2024-02-24 01:10:12,805 INFO [train.py:471] (3/4) Epoch 675, batch 12, global_batch_idx: 24950, batch size: 95, loss[discriminator_loss=2.613, discriminator_real_loss=1.33, discriminator_fake_loss=1.283, generator_loss=30.37, generator_mel_loss=20.13, generator_kl_loss=1.962, generator_dur_loss=1.451, generator_adv_loss=2.074, generator_feat_match_loss=4.75, over 95.00 samples.], tot_loss[discriminator_loss=2.757, discriminator_real_loss=1.415, discriminator_fake_loss=1.342, generator_loss=30.09, generator_mel_loss=20.19, generator_kl_loss=1.954, generator_dur_loss=1.469, generator_adv_loss=2.217, generator_feat_match_loss=4.265, over 1007.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 01:12:33,828 INFO [train.py:845] (3/4) Start epoch 676
2024-02-24 01:14:59,692 INFO [train.py:471] (3/4) Epoch 676, batch 25, global_batch_idx: 25000, batch size: 153, loss[discriminator_loss=2.633, discriminator_real_loss=1.323, discriminator_fake_loss=1.309, generator_loss=30.49, generator_mel_loss=20.43, generator_kl_loss=2.052, generator_dur_loss=1.48, generator_adv_loss=2.221, generator_feat_match_loss=4.305, over 153.00 samples.], tot_loss[discriminator_loss=2.611, discriminator_real_loss=1.32, discriminator_fake_loss=1.292, generator_loss=29.64, generator_mel_loss=20.05, generator_kl_loss=1.958, generator_dur_loss=1.48, generator_adv_loss=2.09, generator_feat_match_loss=4.058, over 1803.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0
2024-02-24 01:14:59,694 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 01:15:07,371 INFO [train.py:534] (3/4) Epoch 676, validation: discriminator_loss=2.597, discriminator_real_loss=1.37, discriminator_fake_loss=1.227, generator_loss=30.54, generator_mel_loss=20.51, generator_kl_loss=2.108, generator_dur_loss=1.471, generator_adv_loss=2.208, generator_feat_match_loss=4.246, over 100.00 samples.
2024-02-24 01:15:07,372 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 01:16:10,138 INFO [train.py:845] (3/4) Start epoch 677 2024-02-24 01:19:37,715 INFO [train.py:845] (3/4) Start epoch 678 2024-02-24 01:19:56,701 INFO [train.py:471] (3/4) Epoch 678, batch 1, global_batch_idx: 25050, batch size: 60, loss[discriminator_loss=2.736, discriminator_real_loss=1.484, discriminator_fake_loss=1.252, generator_loss=29.88, generator_mel_loss=20.14, generator_kl_loss=1.885, generator_dur_loss=1.486, generator_adv_loss=2.34, generator_feat_match_loss=4.027, over 60.00 samples.], tot_loss[discriminator_loss=2.794, discriminator_real_loss=1.474, discriminator_fake_loss=1.32, generator_loss=30.02, generator_mel_loss=20.34, generator_kl_loss=1.878, generator_dur_loss=1.477, generator_adv_loss=2.304, generator_feat_match_loss=4.015, over 170.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:23:00,100 INFO [train.py:845] (3/4) Start epoch 679 2024-02-24 01:24:36,081 INFO [train.py:471] (3/4) Epoch 679, batch 14, global_batch_idx: 25100, batch size: 73, loss[discriminator_loss=2.609, discriminator_real_loss=1.298, discriminator_fake_loss=1.311, generator_loss=30.19, generator_mel_loss=20.45, generator_kl_loss=1.979, generator_dur_loss=1.469, generator_adv_loss=2.143, generator_feat_match_loss=4.145, over 73.00 samples.], tot_loss[discriminator_loss=2.645, discriminator_real_loss=1.333, discriminator_fake_loss=1.313, generator_loss=29.77, generator_mel_loss=20.23, generator_kl_loss=1.95, generator_dur_loss=1.47, generator_adv_loss=2.05, generator_feat_match_loss=4.068, over 1347.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:26:32,489 INFO [train.py:845] (3/4) Start epoch 680 2024-02-24 01:29:06,945 INFO [train.py:471] (3/4) Epoch 680, batch 27, global_batch_idx: 25150, batch size: 55, loss[discriminator_loss=2.635, discriminator_real_loss=1.198, discriminator_fake_loss=1.437, 
generator_loss=28.96, generator_mel_loss=19.66, generator_kl_loss=1.89, generator_dur_loss=1.512, generator_adv_loss=2.115, generator_feat_match_loss=3.781, over 55.00 samples.], tot_loss[discriminator_loss=2.642, discriminator_real_loss=1.329, discriminator_fake_loss=1.313, generator_loss=29.82, generator_mel_loss=20.18, generator_kl_loss=1.9, generator_dur_loss=1.472, generator_adv_loss=2.121, generator_feat_match_loss=4.151, over 1966.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:29:53,282 INFO [train.py:845] (3/4) Start epoch 681 2024-02-24 01:33:21,655 INFO [train.py:845] (3/4) Start epoch 682 2024-02-24 01:33:52,777 INFO [train.py:471] (3/4) Epoch 682, batch 3, global_batch_idx: 25200, batch size: 55, loss[discriminator_loss=2.852, discriminator_real_loss=1.384, discriminator_fake_loss=1.469, generator_loss=29.27, generator_mel_loss=20.22, generator_kl_loss=1.974, generator_dur_loss=1.504, generator_adv_loss=1.957, generator_feat_match_loss=3.607, over 55.00 samples.], tot_loss[discriminator_loss=2.909, discriminator_real_loss=1.465, discriminator_fake_loss=1.444, generator_loss=30.28, generator_mel_loss=20.64, generator_kl_loss=1.905, generator_dur_loss=1.468, generator_adv_loss=2.047, generator_feat_match_loss=4.224, over 378.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:33:52,778 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 01:34:01,955 INFO [train.py:534] (3/4) Epoch 682, validation: discriminator_loss=2.722, discriminator_real_loss=1.403, discriminator_fake_loss=1.319, generator_loss=31.15, generator_mel_loss=21.48, generator_kl_loss=2.031, generator_dur_loss=1.478, generator_adv_loss=2.106, generator_feat_match_loss=4.049, over 100.00 samples. 
2024-02-24 01:34:01,955 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 01:36:53,983 INFO [train.py:845] (3/4) Start epoch 683 2024-02-24 01:38:37,244 INFO [train.py:471] (3/4) Epoch 683, batch 16, global_batch_idx: 25250, batch size: 54, loss[discriminator_loss=2.574, discriminator_real_loss=1.414, discriminator_fake_loss=1.161, generator_loss=29.5, generator_mel_loss=19.63, generator_kl_loss=1.896, generator_dur_loss=1.467, generator_adv_loss=2.281, generator_feat_match_loss=4.227, over 54.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.323, discriminator_fake_loss=1.269, generator_loss=29.95, generator_mel_loss=20.04, generator_kl_loss=1.92, generator_dur_loss=1.477, generator_adv_loss=2.201, generator_feat_match_loss=4.309, over 1083.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 4.0 2024-02-24 01:40:22,738 INFO [train.py:845] (3/4) Start epoch 684 2024-02-24 01:43:13,590 INFO [train.py:471] (3/4) Epoch 684, batch 29, global_batch_idx: 25300, batch size: 63, loss[discriminator_loss=2.65, discriminator_real_loss=1.316, discriminator_fake_loss=1.334, generator_loss=29.71, generator_mel_loss=20.14, generator_kl_loss=1.965, generator_dur_loss=1.488, generator_adv_loss=2.029, generator_feat_match_loss=4.09, over 63.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.329, discriminator_fake_loss=1.298, generator_loss=30.04, generator_mel_loss=20.2, generator_kl_loss=1.92, generator_dur_loss=1.474, generator_adv_loss=2.186, generator_feat_match_loss=4.264, over 2084.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:43:51,336 INFO [train.py:845] (3/4) Start epoch 685 2024-02-24 01:47:23,343 INFO [train.py:845] (3/4) Start epoch 686 2024-02-24 01:48:11,007 INFO [train.py:471] (3/4) Epoch 686, batch 5, global_batch_idx: 25350, batch size: 65, loss[discriminator_loss=2.465, discriminator_real_loss=1.279, discriminator_fake_loss=1.187, 
generator_loss=30.68, generator_mel_loss=19.99, generator_kl_loss=1.806, generator_dur_loss=1.515, generator_adv_loss=2.496, generator_feat_match_loss=4.867, over 65.00 samples.], tot_loss[discriminator_loss=2.581, discriminator_real_loss=1.267, discriminator_fake_loss=1.313, generator_loss=30.63, generator_mel_loss=20.24, generator_kl_loss=1.944, generator_dur_loss=1.491, generator_adv_loss=2.377, generator_feat_match_loss=4.581, over 438.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 4.0 2024-02-24 01:50:52,906 INFO [train.py:845] (3/4) Start epoch 687 2024-02-24 01:52:44,411 INFO [train.py:471] (3/4) Epoch 687, batch 18, global_batch_idx: 25400, batch size: 73, loss[discriminator_loss=2.547, discriminator_real_loss=1.318, discriminator_fake_loss=1.229, generator_loss=30.43, generator_mel_loss=20.63, generator_kl_loss=1.908, generator_dur_loss=1.468, generator_adv_loss=2.146, generator_feat_match_loss=4.285, over 73.00 samples.], tot_loss[discriminator_loss=2.588, discriminator_real_loss=1.301, discriminator_fake_loss=1.288, generator_loss=29.78, generator_mel_loss=20.09, generator_kl_loss=1.924, generator_dur_loss=1.472, generator_adv_loss=2.1, generator_feat_match_loss=4.195, over 1378.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:52:44,412 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 01:52:52,988 INFO [train.py:534] (3/4) Epoch 687, validation: discriminator_loss=2.567, discriminator_real_loss=1.235, discriminator_fake_loss=1.332, generator_loss=30.84, generator_mel_loss=20.99, generator_kl_loss=2.061, generator_dur_loss=1.475, generator_adv_loss=1.95, generator_feat_match_loss=4.365, over 100.00 samples. 
2024-02-24 01:52:52,989 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 01:54:32,329 INFO [train.py:845] (3/4) Start epoch 688 2024-02-24 01:57:32,917 INFO [train.py:471] (3/4) Epoch 688, batch 31, global_batch_idx: 25450, batch size: 76, loss[discriminator_loss=2.738, discriminator_real_loss=1.397, discriminator_fake_loss=1.342, generator_loss=29.5, generator_mel_loss=20.04, generator_kl_loss=1.972, generator_dur_loss=1.463, generator_adv_loss=2.129, generator_feat_match_loss=3.891, over 76.00 samples.], tot_loss[discriminator_loss=2.682, discriminator_real_loss=1.35, discriminator_fake_loss=1.332, generator_loss=29.87, generator_mel_loss=20.3, generator_kl_loss=1.92, generator_dur_loss=1.475, generator_adv_loss=2.089, generator_feat_match_loss=4.091, over 2260.00 samples.], cur_lr_g: 1.84e-04, cur_lr_d: 1.84e-04, grad_scale: 8.0 2024-02-24 01:58:02,482 INFO [train.py:845] (3/4) Start epoch 689 2024-02-24 02:01:27,731 INFO [train.py:845] (3/4) Start epoch 690 2024-02-24 02:02:18,864 INFO [train.py:471] (3/4) Epoch 690, batch 7, global_batch_idx: 25500, batch size: 65, loss[discriminator_loss=2.521, discriminator_real_loss=1.38, discriminator_fake_loss=1.142, generator_loss=30.98, generator_mel_loss=20.5, generator_kl_loss=1.919, generator_dur_loss=1.485, generator_adv_loss=2.422, generator_feat_match_loss=4.656, over 65.00 samples.], tot_loss[discriminator_loss=2.601, discriminator_real_loss=1.301, discriminator_fake_loss=1.3, generator_loss=30.44, generator_mel_loss=20.25, generator_kl_loss=1.973, generator_dur_loss=1.477, generator_adv_loss=2.296, generator_feat_match_loss=4.449, over 548.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:05:00,051 INFO [train.py:845] (3/4) Start epoch 691 2024-02-24 02:07:01,622 INFO [train.py:471] (3/4) Epoch 691, batch 20, global_batch_idx: 25550, batch size: 52, loss[discriminator_loss=2.465, discriminator_real_loss=1.3, discriminator_fake_loss=1.166, 
generator_loss=30.85, generator_mel_loss=20.15, generator_kl_loss=1.892, generator_dur_loss=1.484, generator_adv_loss=2.43, generator_feat_match_loss=4.898, over 52.00 samples.], tot_loss[discriminator_loss=2.576, discriminator_real_loss=1.291, discriminator_fake_loss=1.285, generator_loss=30.52, generator_mel_loss=20.25, generator_kl_loss=1.936, generator_dur_loss=1.478, generator_adv_loss=2.338, generator_feat_match_loss=4.521, over 1415.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 02:08:31,191 INFO [train.py:845] (3/4) Start epoch 692 2024-02-24 02:11:45,048 INFO [train.py:471] (3/4) Epoch 692, batch 33, global_batch_idx: 25600, batch size: 81, loss[discriminator_loss=2.523, discriminator_real_loss=1.275, discriminator_fake_loss=1.248, generator_loss=29.81, generator_mel_loss=20.08, generator_kl_loss=1.936, generator_dur_loss=1.46, generator_adv_loss=2.154, generator_feat_match_loss=4.18, over 81.00 samples.], tot_loss[discriminator_loss=2.592, discriminator_real_loss=1.314, discriminator_fake_loss=1.277, generator_loss=29.98, generator_mel_loss=20.14, generator_kl_loss=1.935, generator_dur_loss=1.472, generator_adv_loss=2.153, generator_feat_match_loss=4.276, over 2568.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:11:45,050 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 02:11:54,227 INFO [train.py:534] (3/4) Epoch 692, validation: discriminator_loss=2.496, discriminator_real_loss=1.111, discriminator_fake_loss=1.385, generator_loss=30.57, generator_mel_loss=20.74, generator_kl_loss=2.04, generator_dur_loss=1.475, generator_adv_loss=1.97, generator_feat_match_loss=4.353, over 100.00 samples. 
2024-02-24 02:11:54,228 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 02:12:06,587 INFO [train.py:845] (3/4) Start epoch 693 2024-02-24 02:15:32,822 INFO [train.py:845] (3/4) Start epoch 694 2024-02-24 02:16:33,515 INFO [train.py:471] (3/4) Epoch 694, batch 9, global_batch_idx: 25650, batch size: 69, loss[discriminator_loss=2.742, discriminator_real_loss=1.419, discriminator_fake_loss=1.322, generator_loss=29.3, generator_mel_loss=19.81, generator_kl_loss=1.826, generator_dur_loss=1.467, generator_adv_loss=2.195, generator_feat_match_loss=4.004, over 69.00 samples.], tot_loss[discriminator_loss=2.516, discriminator_real_loss=1.284, discriminator_fake_loss=1.231, generator_loss=30.11, generator_mel_loss=19.93, generator_kl_loss=1.888, generator_dur_loss=1.486, generator_adv_loss=2.234, generator_feat_match_loss=4.568, over 607.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 02:19:00,844 INFO [train.py:845] (3/4) Start epoch 695 2024-02-24 02:21:13,843 INFO [train.py:471] (3/4) Epoch 695, batch 22, global_batch_idx: 25700, batch size: 53, loss[discriminator_loss=2.549, discriminator_real_loss=1.215, discriminator_fake_loss=1.334, generator_loss=30.25, generator_mel_loss=20.46, generator_kl_loss=1.842, generator_dur_loss=1.522, generator_adv_loss=2.193, generator_feat_match_loss=4.234, over 53.00 samples.], tot_loss[discriminator_loss=2.616, discriminator_real_loss=1.327, discriminator_fake_loss=1.289, generator_loss=29.93, generator_mel_loss=20.21, generator_kl_loss=1.938, generator_dur_loss=1.48, generator_adv_loss=2.113, generator_feat_match_loss=4.189, over 1715.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:22:26,410 INFO [train.py:845] (3/4) Start epoch 696 2024-02-24 02:25:38,243 INFO [train.py:471] (3/4) Epoch 696, batch 35, global_batch_idx: 25750, batch size: 51, loss[discriminator_loss=2.553, discriminator_real_loss=1.135, discriminator_fake_loss=1.418, 
generator_loss=30.55, generator_mel_loss=20.33, generator_kl_loss=1.926, generator_dur_loss=1.497, generator_adv_loss=2.332, generator_feat_match_loss=4.469, over 51.00 samples.], tot_loss[discriminator_loss=2.598, discriminator_real_loss=1.298, discriminator_fake_loss=1.3, generator_loss=30.44, generator_mel_loss=20.33, generator_kl_loss=1.907, generator_dur_loss=1.474, generator_adv_loss=2.25, generator_feat_match_loss=4.487, over 2710.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 02:25:46,346 INFO [train.py:845] (3/4) Start epoch 697 2024-02-24 02:29:16,778 INFO [train.py:845] (3/4) Start epoch 698 2024-02-24 02:30:36,823 INFO [train.py:471] (3/4) Epoch 698, batch 11, global_batch_idx: 25800, batch size: 61, loss[discriminator_loss=2.594, discriminator_real_loss=1.254, discriminator_fake_loss=1.34, generator_loss=29.96, generator_mel_loss=20.24, generator_kl_loss=1.94, generator_dur_loss=1.478, generator_adv_loss=2.105, generator_feat_match_loss=4.191, over 61.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.283, discriminator_fake_loss=1.302, generator_loss=29.92, generator_mel_loss=20.16, generator_kl_loss=1.922, generator_dur_loss=1.476, generator_adv_loss=2.115, generator_feat_match_loss=4.249, over 863.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:30:36,825 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 02:30:45,281 INFO [train.py:534] (3/4) Epoch 698, validation: discriminator_loss=2.531, discriminator_real_loss=1.305, discriminator_fake_loss=1.226, generator_loss=30.77, generator_mel_loss=20.74, generator_kl_loss=2.004, generator_dur_loss=1.481, generator_adv_loss=2.182, generator_feat_match_loss=4.356, over 100.00 samples. 
2024-02-24 02:30:45,281 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 02:32:56,626 INFO [train.py:845] (3/4) Start epoch 699 2024-02-24 02:35:12,078 INFO [train.py:471] (3/4) Epoch 699, batch 24, global_batch_idx: 25850, batch size: 61, loss[discriminator_loss=2.631, discriminator_real_loss=1.453, discriminator_fake_loss=1.178, generator_loss=30.52, generator_mel_loss=20.17, generator_kl_loss=1.922, generator_dur_loss=1.479, generator_adv_loss=2.326, generator_feat_match_loss=4.625, over 61.00 samples.], tot_loss[discriminator_loss=2.605, discriminator_real_loss=1.339, discriminator_fake_loss=1.266, generator_loss=30.35, generator_mel_loss=20.17, generator_kl_loss=1.932, generator_dur_loss=1.471, generator_adv_loss=2.283, generator_feat_match_loss=4.502, over 1863.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 02:36:22,750 INFO [train.py:845] (3/4) Start epoch 700 2024-02-24 02:39:49,212 INFO [train.py:845] (3/4) Start epoch 701 2024-02-24 02:40:02,137 INFO [train.py:471] (3/4) Epoch 701, batch 0, global_batch_idx: 25900, batch size: 65, loss[discriminator_loss=2.703, discriminator_real_loss=1.433, discriminator_fake_loss=1.271, generator_loss=29.45, generator_mel_loss=20.2, generator_kl_loss=1.862, generator_dur_loss=1.497, generator_adv_loss=1.95, generator_feat_match_loss=3.943, over 65.00 samples.], tot_loss[discriminator_loss=2.703, discriminator_real_loss=1.433, discriminator_fake_loss=1.271, generator_loss=29.45, generator_mel_loss=20.2, generator_kl_loss=1.862, generator_dur_loss=1.497, generator_adv_loss=1.95, generator_feat_match_loss=3.943, over 65.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:43:18,403 INFO [train.py:845] (3/4) Start epoch 702 2024-02-24 02:44:43,441 INFO [train.py:471] (3/4) Epoch 702, batch 13, global_batch_idx: 25950, batch size: 56, loss[discriminator_loss=2.711, discriminator_real_loss=1.154, discriminator_fake_loss=1.556, 
generator_loss=30.15, generator_mel_loss=20.27, generator_kl_loss=1.884, generator_dur_loss=1.484, generator_adv_loss=2.18, generator_feat_match_loss=4.324, over 56.00 samples.], tot_loss[discriminator_loss=2.658, discriminator_real_loss=1.329, discriminator_fake_loss=1.33, generator_loss=30.24, generator_mel_loss=20.42, generator_kl_loss=1.929, generator_dur_loss=1.467, generator_adv_loss=2.156, generator_feat_match_loss=4.276, over 1075.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:46:48,090 INFO [train.py:845] (3/4) Start epoch 703 2024-02-24 02:49:24,985 INFO [train.py:471] (3/4) Epoch 703, batch 26, global_batch_idx: 26000, batch size: 56, loss[discriminator_loss=2.633, discriminator_real_loss=1.261, discriminator_fake_loss=1.371, generator_loss=29.63, generator_mel_loss=19.92, generator_kl_loss=1.932, generator_dur_loss=1.487, generator_adv_loss=2.09, generator_feat_match_loss=4.203, over 56.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.294, discriminator_fake_loss=1.268, generator_loss=30.45, generator_mel_loss=20.21, generator_kl_loss=1.932, generator_dur_loss=1.472, generator_adv_loss=2.278, generator_feat_match_loss=4.557, over 1825.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:49:24,986 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 02:49:33,688 INFO [train.py:534] (3/4) Epoch 703, validation: discriminator_loss=2.517, discriminator_real_loss=1.243, discriminator_fake_loss=1.275, generator_loss=31.21, generator_mel_loss=20.55, generator_kl_loss=2.055, generator_dur_loss=1.477, generator_adv_loss=2.355, generator_feat_match_loss=4.774, over 100.00 samples. 
2024-02-24 02:49:33,690 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 02:50:34,689 INFO [train.py:845] (3/4) Start epoch 704 2024-02-24 02:54:03,521 INFO [train.py:845] (3/4) Start epoch 705 2024-02-24 02:54:25,665 INFO [train.py:471] (3/4) Epoch 705, batch 2, global_batch_idx: 26050, batch size: 90, loss[discriminator_loss=2.477, discriminator_real_loss=1.103, discriminator_fake_loss=1.375, generator_loss=31.26, generator_mel_loss=20.17, generator_kl_loss=1.926, generator_dur_loss=1.478, generator_adv_loss=2.443, generator_feat_match_loss=5.242, over 90.00 samples.], tot_loss[discriminator_loss=2.458, discriminator_real_loss=1.196, discriminator_fake_loss=1.262, generator_loss=30.99, generator_mel_loss=20, generator_kl_loss=1.935, generator_dur_loss=1.498, generator_adv_loss=2.375, generator_feat_match_loss=5.176, over 197.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 02:57:32,043 INFO [train.py:845] (3/4) Start epoch 706 2024-02-24 02:59:06,167 INFO [train.py:471] (3/4) Epoch 706, batch 15, global_batch_idx: 26100, batch size: 54, loss[discriminator_loss=2.695, discriminator_real_loss=1.373, discriminator_fake_loss=1.323, generator_loss=30, generator_mel_loss=20.31, generator_kl_loss=1.999, generator_dur_loss=1.514, generator_adv_loss=2.152, generator_feat_match_loss=4.031, over 54.00 samples.], tot_loss[discriminator_loss=2.586, discriminator_real_loss=1.305, discriminator_fake_loss=1.282, generator_loss=30.14, generator_mel_loss=20.27, generator_kl_loss=1.956, generator_dur_loss=1.482, generator_adv_loss=2.106, generator_feat_match_loss=4.321, over 1039.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:00:58,126 INFO [train.py:845] (3/4) Start epoch 707 2024-02-24 03:03:40,066 INFO [train.py:471] (3/4) Epoch 707, batch 28, global_batch_idx: 26150, batch size: 69, loss[discriminator_loss=2.637, discriminator_real_loss=1.447, discriminator_fake_loss=1.188, 
generator_loss=30.13, generator_mel_loss=20.32, generator_kl_loss=1.926, generator_dur_loss=1.466, generator_adv_loss=2.082, generator_feat_match_loss=4.336, over 69.00 samples.], tot_loss[discriminator_loss=2.636, discriminator_real_loss=1.344, discriminator_fake_loss=1.292, generator_loss=30.19, generator_mel_loss=20.36, generator_kl_loss=1.946, generator_dur_loss=1.471, generator_adv_loss=2.139, generator_feat_match_loss=4.272, over 2214.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:04:26,824 INFO [train.py:845] (3/4) Start epoch 708 2024-02-24 03:07:59,613 INFO [train.py:845] (3/4) Start epoch 709 2024-02-24 03:08:32,341 INFO [train.py:471] (3/4) Epoch 709, batch 4, global_batch_idx: 26200, batch size: 73, loss[discriminator_loss=2.715, discriminator_real_loss=1.296, discriminator_fake_loss=1.418, generator_loss=29.51, generator_mel_loss=20.1, generator_kl_loss=1.97, generator_dur_loss=1.479, generator_adv_loss=2.082, generator_feat_match_loss=3.879, over 73.00 samples.], tot_loss[discriminator_loss=2.637, discriminator_real_loss=1.329, discriminator_fake_loss=1.307, generator_loss=29.82, generator_mel_loss=19.99, generator_kl_loss=1.908, generator_dur_loss=1.475, generator_adv_loss=2.186, generator_feat_match_loss=4.259, over 320.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:08:32,343 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 03:08:41,300 INFO [train.py:534] (3/4) Epoch 709, validation: discriminator_loss=2.635, discriminator_real_loss=1.287, discriminator_fake_loss=1.347, generator_loss=30.21, generator_mel_loss=20.54, generator_kl_loss=2.063, generator_dur_loss=1.469, generator_adv_loss=1.94, generator_feat_match_loss=4.192, over 100.00 samples. 
2024-02-24 03:08:41,300 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 03:11:31,833 INFO [train.py:845] (3/4) Start epoch 710 2024-02-24 03:13:25,661 INFO [train.py:471] (3/4) Epoch 710, batch 17, global_batch_idx: 26250, batch size: 61, loss[discriminator_loss=2.609, discriminator_real_loss=1.184, discriminator_fake_loss=1.425, generator_loss=29.35, generator_mel_loss=19.87, generator_kl_loss=1.976, generator_dur_loss=1.494, generator_adv_loss=2.189, generator_feat_match_loss=3.822, over 61.00 samples.], tot_loss[discriminator_loss=2.589, discriminator_real_loss=1.314, discriminator_fake_loss=1.276, generator_loss=30.37, generator_mel_loss=20.21, generator_kl_loss=1.948, generator_dur_loss=1.474, generator_adv_loss=2.262, generator_feat_match_loss=4.479, over 1366.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 03:15:04,281 INFO [train.py:845] (3/4) Start epoch 711 2024-02-24 03:18:09,996 INFO [train.py:471] (3/4) Epoch 711, batch 30, global_batch_idx: 26300, batch size: 85, loss[discriminator_loss=2.578, discriminator_real_loss=1.399, discriminator_fake_loss=1.179, generator_loss=30.25, generator_mel_loss=20.36, generator_kl_loss=1.976, generator_dur_loss=1.467, generator_adv_loss=2.07, generator_feat_match_loss=4.371, over 85.00 samples.], tot_loss[discriminator_loss=2.567, discriminator_real_loss=1.296, discriminator_fake_loss=1.272, generator_loss=30.27, generator_mel_loss=20.23, generator_kl_loss=1.94, generator_dur_loss=1.472, generator_adv_loss=2.178, generator_feat_match_loss=4.455, over 2492.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:18:38,257 INFO [train.py:845] (3/4) Start epoch 712 2024-02-24 03:22:05,957 INFO [train.py:845] (3/4) Start epoch 713 2024-02-24 03:22:54,930 INFO [train.py:471] (3/4) Epoch 713, batch 6, global_batch_idx: 26350, batch size: 69, loss[discriminator_loss=2.629, discriminator_real_loss=1.271, discriminator_fake_loss=1.356, 
generator_loss=29.94, generator_mel_loss=20.12, generator_kl_loss=1.932, generator_dur_loss=1.475, generator_adv_loss=2.287, generator_feat_match_loss=4.121, over 69.00 samples.], tot_loss[discriminator_loss=2.627, discriminator_real_loss=1.311, discriminator_fake_loss=1.316, generator_loss=29.83, generator_mel_loss=20.03, generator_kl_loss=1.948, generator_dur_loss=1.476, generator_adv_loss=2.159, generator_feat_match_loss=4.218, over 460.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:25:37,886 INFO [train.py:845] (3/4) Start epoch 714 2024-02-24 03:27:35,529 INFO [train.py:471] (3/4) Epoch 714, batch 19, global_batch_idx: 26400, batch size: 49, loss[discriminator_loss=2.459, discriminator_real_loss=1.308, discriminator_fake_loss=1.151, generator_loss=30.61, generator_mel_loss=20.1, generator_kl_loss=1.984, generator_dur_loss=1.482, generator_adv_loss=2.238, generator_feat_match_loss=4.809, over 49.00 samples.], tot_loss[discriminator_loss=2.622, discriminator_real_loss=1.362, discriminator_fake_loss=1.261, generator_loss=30.48, generator_mel_loss=20.21, generator_kl_loss=1.936, generator_dur_loss=1.475, generator_adv_loss=2.295, generator_feat_match_loss=4.562, over 1366.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:27:35,531 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 03:27:43,865 INFO [train.py:534] (3/4) Epoch 714, validation: discriminator_loss=2.401, discriminator_real_loss=1.193, discriminator_fake_loss=1.208, generator_loss=31.51, generator_mel_loss=20.91, generator_kl_loss=2.072, generator_dur_loss=1.469, generator_adv_loss=2.182, generator_feat_match_loss=4.871, over 100.00 samples. 
2024-02-24 03:27:43,867 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 03:29:17,465 INFO [train.py:845] (3/4) Start epoch 715 2024-02-24 03:32:29,870 INFO [train.py:471] (3/4) Epoch 715, batch 32, global_batch_idx: 26450, batch size: 56, loss[discriminator_loss=2.621, discriminator_real_loss=1.325, discriminator_fake_loss=1.295, generator_loss=29.76, generator_mel_loss=20.03, generator_kl_loss=1.905, generator_dur_loss=1.47, generator_adv_loss=2.078, generator_feat_match_loss=4.285, over 56.00 samples.], tot_loss[discriminator_loss=2.584, discriminator_real_loss=1.304, discriminator_fake_loss=1.28, generator_loss=30.04, generator_mel_loss=20.12, generator_kl_loss=1.94, generator_dur_loss=1.472, generator_adv_loss=2.149, generator_feat_match_loss=4.364, over 2357.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:32:46,896 INFO [train.py:845] (3/4) Start epoch 716 2024-02-24 03:36:12,887 INFO [train.py:845] (3/4) Start epoch 717 2024-02-24 03:37:04,812 INFO [train.py:471] (3/4) Epoch 717, batch 8, global_batch_idx: 26500, batch size: 56, loss[discriminator_loss=2.629, discriminator_real_loss=1.077, discriminator_fake_loss=1.551, generator_loss=30.43, generator_mel_loss=20.05, generator_kl_loss=1.961, generator_dur_loss=1.458, generator_adv_loss=2.182, generator_feat_match_loss=4.781, over 56.00 samples.], tot_loss[discriminator_loss=2.629, discriminator_real_loss=1.294, discriminator_fake_loss=1.336, generator_loss=30.37, generator_mel_loss=20.05, generator_kl_loss=1.887, generator_dur_loss=1.464, generator_adv_loss=2.283, generator_feat_match_loss=4.686, over 609.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:39:38,637 INFO [train.py:845] (3/4) Start epoch 718 2024-02-24 03:41:50,065 INFO [train.py:471] (3/4) Epoch 718, batch 21, global_batch_idx: 26550, batch size: 101, loss[discriminator_loss=2.611, discriminator_real_loss=1.335, discriminator_fake_loss=1.276, 
generator_loss=30.13, generator_mel_loss=20.39, generator_kl_loss=2.001, generator_dur_loss=1.47, generator_adv_loss=2.15, generator_feat_match_loss=4.125, over 101.00 samples.], tot_loss[discriminator_loss=2.628, discriminator_real_loss=1.326, discriminator_fake_loss=1.303, generator_loss=30.07, generator_mel_loss=20.33, generator_kl_loss=1.939, generator_dur_loss=1.475, generator_adv_loss=2.121, generator_feat_match_loss=4.207, over 1564.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:43:08,511 INFO [train.py:845] (3/4) Start epoch 719 2024-02-24 03:46:26,625 INFO [train.py:471] (3/4) Epoch 719, batch 34, global_batch_idx: 26600, batch size: 81, loss[discriminator_loss=2.59, discriminator_real_loss=1.339, discriminator_fake_loss=1.251, generator_loss=30.65, generator_mel_loss=20.07, generator_kl_loss=2.012, generator_dur_loss=1.468, generator_adv_loss=2.436, generator_feat_match_loss=4.672, over 81.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.317, discriminator_fake_loss=1.298, generator_loss=30.18, generator_mel_loss=20.27, generator_kl_loss=1.925, generator_dur_loss=1.476, generator_adv_loss=2.171, generator_feat_match_loss=4.341, over 2675.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:46:26,627 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 03:46:35,549 INFO [train.py:534] (3/4) Epoch 719, validation: discriminator_loss=2.669, discriminator_real_loss=1.318, discriminator_fake_loss=1.35, generator_loss=30.97, generator_mel_loss=21.19, generator_kl_loss=2.045, generator_dur_loss=1.477, generator_adv_loss=1.949, generator_feat_match_loss=4.31, over 100.00 samples. 
2024-02-24 03:46:35,550 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 03:46:45,682 INFO [train.py:845] (3/4) Start epoch 720 2024-02-24 03:50:11,735 INFO [train.py:845] (3/4) Start epoch 721 2024-02-24 03:51:15,791 INFO [train.py:471] (3/4) Epoch 721, batch 10, global_batch_idx: 26650, batch size: 64, loss[discriminator_loss=2.537, discriminator_real_loss=1.355, discriminator_fake_loss=1.182, generator_loss=30.98, generator_mel_loss=20.21, generator_kl_loss=1.947, generator_dur_loss=1.487, generator_adv_loss=2.562, generator_feat_match_loss=4.777, over 64.00 samples.], tot_loss[discriminator_loss=2.544, discriminator_real_loss=1.291, discriminator_fake_loss=1.253, generator_loss=30.57, generator_mel_loss=20.1, generator_kl_loss=1.918, generator_dur_loss=1.474, generator_adv_loss=2.375, generator_feat_match_loss=4.704, over 819.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 03:53:32,889 INFO [train.py:845] (3/4) Start epoch 722 2024-02-24 03:55:53,683 INFO [train.py:471] (3/4) Epoch 722, batch 23, global_batch_idx: 26700, batch size: 90, loss[discriminator_loss=2.457, discriminator_real_loss=1.404, discriminator_fake_loss=1.053, generator_loss=31.14, generator_mel_loss=19.92, generator_kl_loss=1.973, generator_dur_loss=1.477, generator_adv_loss=2.527, generator_feat_match_loss=5.238, over 90.00 samples.], tot_loss[discriminator_loss=2.533, discriminator_real_loss=1.279, discriminator_fake_loss=1.255, generator_loss=30.57, generator_mel_loss=20.06, generator_kl_loss=1.941, generator_dur_loss=1.474, generator_adv_loss=2.365, generator_feat_match_loss=4.723, over 1702.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 03:57:01,268 INFO [train.py:845] (3/4) Start epoch 723 2024-02-24 04:00:33,901 INFO [train.py:471] (3/4) Epoch 723, batch 36, global_batch_idx: 26750, batch size: 85, loss[discriminator_loss=2.471, discriminator_real_loss=1.193, discriminator_fake_loss=1.277, 
generator_loss=30.18, generator_mel_loss=20.08, generator_kl_loss=1.981, generator_dur_loss=1.47, generator_adv_loss=2.141, generator_feat_match_loss=4.508, over 85.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.304, discriminator_fake_loss=1.264, generator_loss=30.46, generator_mel_loss=20.18, generator_kl_loss=1.953, generator_dur_loss=1.475, generator_adv_loss=2.282, generator_feat_match_loss=4.564, over 2759.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 04:00:34,339 INFO [train.py:845] (3/4) Start epoch 724 2024-02-24 04:04:03,229 INFO [train.py:845] (3/4) Start epoch 725 2024-02-24 04:05:19,263 INFO [train.py:471] (3/4) Epoch 725, batch 12, global_batch_idx: 26800, batch size: 50, loss[discriminator_loss=2.443, discriminator_real_loss=1.154, discriminator_fake_loss=1.289, generator_loss=30.23, generator_mel_loss=19.94, generator_kl_loss=1.957, generator_dur_loss=1.448, generator_adv_loss=2.24, generator_feat_match_loss=4.652, over 50.00 samples.], tot_loss[discriminator_loss=2.53, discriminator_real_loss=1.275, discriminator_fake_loss=1.254, generator_loss=30.6, generator_mel_loss=20.2, generator_kl_loss=1.958, generator_dur_loss=1.467, generator_adv_loss=2.304, generator_feat_match_loss=4.671, over 957.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 04:05:19,265 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 04:05:28,171 INFO [train.py:534] (3/4) Epoch 725, validation: discriminator_loss=2.465, discriminator_real_loss=1.132, discriminator_fake_loss=1.333, generator_loss=30.66, generator_mel_loss=20.64, generator_kl_loss=2.01, generator_dur_loss=1.475, generator_adv_loss=1.965, generator_feat_match_loss=4.567, over 100.00 samples. 
2024-02-24 04:05:28,171 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 04:07:42,293 INFO [train.py:845] (3/4) Start epoch 726 2024-02-24 04:10:07,116 INFO [train.py:471] (3/4) Epoch 726, batch 25, global_batch_idx: 26850, batch size: 81, loss[discriminator_loss=2.602, discriminator_real_loss=1.356, discriminator_fake_loss=1.246, generator_loss=30.26, generator_mel_loss=19.95, generator_kl_loss=1.936, generator_dur_loss=1.476, generator_adv_loss=2.291, generator_feat_match_loss=4.602, over 81.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.296, discriminator_fake_loss=1.261, generator_loss=30.4, generator_mel_loss=20.13, generator_kl_loss=1.947, generator_dur_loss=1.477, generator_adv_loss=2.268, generator_feat_match_loss=4.583, over 1891.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 04:11:10,535 INFO [train.py:845] (3/4) Start epoch 727 2024-02-24 04:14:37,512 INFO [train.py:845] (3/4) Start epoch 728 2024-02-24 04:14:57,106 INFO [train.py:471] (3/4) Epoch 728, batch 1, global_batch_idx: 26900, batch size: 50, loss[discriminator_loss=2.441, discriminator_real_loss=1.253, discriminator_fake_loss=1.189, generator_loss=30.28, generator_mel_loss=19.66, generator_kl_loss=1.963, generator_dur_loss=1.503, generator_adv_loss=2.258, generator_feat_match_loss=4.898, over 50.00 samples.], tot_loss[discriminator_loss=2.417, discriminator_real_loss=1.232, discriminator_fake_loss=1.186, generator_loss=30.19, generator_mel_loss=19.68, generator_kl_loss=1.913, generator_dur_loss=1.484, generator_adv_loss=2.338, generator_feat_match_loss=4.772, over 113.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 04:18:06,161 INFO [train.py:845] (3/4) Start epoch 729 2024-02-24 04:19:37,808 INFO [train.py:471] (3/4) Epoch 729, batch 14, global_batch_idx: 26950, batch size: 95, loss[discriminator_loss=2.502, discriminator_real_loss=1.221, discriminator_fake_loss=1.281, 
generator_loss=30.36, generator_mel_loss=20.16, generator_kl_loss=1.999, generator_dur_loss=1.477, generator_adv_loss=2.178, generator_feat_match_loss=4.547, over 95.00 samples.], tot_loss[discriminator_loss=2.593, discriminator_real_loss=1.329, discriminator_fake_loss=1.263, generator_loss=30.19, generator_mel_loss=19.96, generator_kl_loss=1.932, generator_dur_loss=1.476, generator_adv_loss=2.26, generator_feat_match_loss=4.559, over 948.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 04:21:39,647 INFO [train.py:845] (3/4) Start epoch 730 2024-02-24 04:24:22,672 INFO [train.py:471] (3/4) Epoch 730, batch 27, global_batch_idx: 27000, batch size: 63, loss[discriminator_loss=2.539, discriminator_real_loss=1.271, discriminator_fake_loss=1.269, generator_loss=29.81, generator_mel_loss=19.8, generator_kl_loss=2.05, generator_dur_loss=1.493, generator_adv_loss=2.238, generator_feat_match_loss=4.227, over 63.00 samples.], tot_loss[discriminator_loss=2.583, discriminator_real_loss=1.292, discriminator_fake_loss=1.291, generator_loss=30.61, generator_mel_loss=20.35, generator_kl_loss=1.958, generator_dur_loss=1.472, generator_adv_loss=2.265, generator_feat_match_loss=4.561, over 2068.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 04:24:22,673 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 04:24:31,407 INFO [train.py:534] (3/4) Epoch 730, validation: discriminator_loss=2.456, discriminator_real_loss=1.185, discriminator_fake_loss=1.272, generator_loss=31.51, generator_mel_loss=21.07, generator_kl_loss=2.036, generator_dur_loss=1.475, generator_adv_loss=2.24, generator_feat_match_loss=4.69, over 100.00 samples. 
2024-02-24 04:24:31,408 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 04:25:19,545 INFO [train.py:845] (3/4) Start epoch 731 2024-02-24 04:28:48,504 INFO [train.py:845] (3/4) Start epoch 732 2024-02-24 04:29:20,237 INFO [train.py:471] (3/4) Epoch 732, batch 3, global_batch_idx: 27050, batch size: 60, loss[discriminator_loss=2.471, discriminator_real_loss=1.309, discriminator_fake_loss=1.162, generator_loss=30.16, generator_mel_loss=20.26, generator_kl_loss=1.948, generator_dur_loss=1.482, generator_adv_loss=2.096, generator_feat_match_loss=4.375, over 60.00 samples.], tot_loss[discriminator_loss=2.615, discriminator_real_loss=1.283, discriminator_fake_loss=1.333, generator_loss=29.94, generator_mel_loss=20.09, generator_kl_loss=1.947, generator_dur_loss=1.468, generator_adv_loss=2.125, generator_feat_match_loss=4.308, over 330.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 4.0 2024-02-24 04:32:12,247 INFO [train.py:845] (3/4) Start epoch 733 2024-02-24 04:33:50,834 INFO [train.py:471] (3/4) Epoch 733, batch 16, global_batch_idx: 27100, batch size: 52, loss[discriminator_loss=2.533, discriminator_real_loss=1.373, discriminator_fake_loss=1.16, generator_loss=30.31, generator_mel_loss=19.72, generator_kl_loss=1.966, generator_dur_loss=1.486, generator_adv_loss=2.543, generator_feat_match_loss=4.598, over 52.00 samples.], tot_loss[discriminator_loss=2.557, discriminator_real_loss=1.289, discriminator_fake_loss=1.269, generator_loss=30.55, generator_mel_loss=20.2, generator_kl_loss=1.961, generator_dur_loss=1.477, generator_adv_loss=2.296, generator_feat_match_loss=4.609, over 1170.00 samples.], cur_lr_g: 1.83e-04, cur_lr_d: 1.83e-04, grad_scale: 8.0 2024-02-24 04:35:37,702 INFO [train.py:845] (3/4) Start epoch 734 2024-02-24 04:38:20,760 INFO [train.py:471] (3/4) Epoch 734, batch 29, global_batch_idx: 27150, batch size: 85, loss[discriminator_loss=2.652, discriminator_real_loss=1.215, discriminator_fake_loss=1.438, 
generator_loss=29.51, generator_mel_loss=19.81, generator_kl_loss=1.979, generator_dur_loss=1.463, generator_adv_loss=2.145, generator_feat_match_loss=4.109, over 85.00 samples.], tot_loss[discriminator_loss=2.547, discriminator_real_loss=1.289, discriminator_fake_loss=1.257, generator_loss=30.25, generator_mel_loss=20.08, generator_kl_loss=1.938, generator_dur_loss=1.47, generator_adv_loss=2.225, generator_feat_match_loss=4.54, over 2181.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 04:39:01,723 INFO [train.py:845] (3/4) Start epoch 735 2024-02-24 04:42:30,187 INFO [train.py:845] (3/4) Start epoch 736 2024-02-24 04:43:15,010 INFO [train.py:471] (3/4) Epoch 736, batch 5, global_batch_idx: 27200, batch size: 73, loss[discriminator_loss=2.385, discriminator_real_loss=1.168, discriminator_fake_loss=1.217, generator_loss=30.77, generator_mel_loss=20.09, generator_kl_loss=1.923, generator_dur_loss=1.464, generator_adv_loss=2.207, generator_feat_match_loss=5.086, over 73.00 samples.], tot_loss[discriminator_loss=2.606, discriminator_real_loss=1.364, discriminator_fake_loss=1.242, generator_loss=30.17, generator_mel_loss=19.95, generator_kl_loss=1.944, generator_dur_loss=1.464, generator_adv_loss=2.255, generator_feat_match_loss=4.56, over 569.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 04:43:15,012 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 04:43:24,066 INFO [train.py:534] (3/4) Epoch 736, validation: discriminator_loss=2.546, discriminator_real_loss=1.136, discriminator_fake_loss=1.41, generator_loss=30.93, generator_mel_loss=20.71, generator_kl_loss=2.152, generator_dur_loss=1.473, generator_adv_loss=1.866, generator_feat_match_loss=4.735, over 100.00 samples. 
2024-02-24 04:43:24,067 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 04:46:09,428 INFO [train.py:845] (3/4) Start epoch 737 2024-02-24 04:48:08,529 INFO [train.py:471] (3/4) Epoch 737, batch 18, global_batch_idx: 27250, batch size: 101, loss[discriminator_loss=2.547, discriminator_real_loss=1.396, discriminator_fake_loss=1.151, generator_loss=31.05, generator_mel_loss=20.4, generator_kl_loss=1.973, generator_dur_loss=1.466, generator_adv_loss=2.41, generator_feat_match_loss=4.809, over 101.00 samples.], tot_loss[discriminator_loss=2.599, discriminator_real_loss=1.315, discriminator_fake_loss=1.284, generator_loss=30.69, generator_mel_loss=20.27, generator_kl_loss=1.953, generator_dur_loss=1.467, generator_adv_loss=2.308, generator_feat_match_loss=4.699, over 1494.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 04:49:41,124 INFO [train.py:845] (3/4) Start epoch 738 2024-02-24 04:52:38,544 INFO [train.py:471] (3/4) Epoch 738, batch 31, global_batch_idx: 27300, batch size: 101, loss[discriminator_loss=2.516, discriminator_real_loss=1.178, discriminator_fake_loss=1.339, generator_loss=30.79, generator_mel_loss=20.31, generator_kl_loss=1.897, generator_dur_loss=1.453, generator_adv_loss=2.373, generator_feat_match_loss=4.762, over 101.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.298, discriminator_fake_loss=1.263, generator_loss=30.33, generator_mel_loss=20.12, generator_kl_loss=1.947, generator_dur_loss=1.468, generator_adv_loss=2.226, generator_feat_match_loss=4.572, over 2441.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 04:53:05,656 INFO [train.py:845] (3/4) Start epoch 739 2024-02-24 04:56:35,448 INFO [train.py:845] (3/4) Start epoch 740 2024-02-24 04:57:22,315 INFO [train.py:471] (3/4) Epoch 740, batch 7, global_batch_idx: 27350, batch size: 49, loss[discriminator_loss=2.566, discriminator_real_loss=1.361, 
discriminator_fake_loss=1.204, generator_loss=30.18, generator_mel_loss=19.79, generator_kl_loss=1.92, generator_dur_loss=1.491, generator_adv_loss=2.414, generator_feat_match_loss=4.57, over 49.00 samples.], tot_loss[discriminator_loss=2.558, discriminator_real_loss=1.285, discriminator_fake_loss=1.273, generator_loss=29.99, generator_mel_loss=19.89, generator_kl_loss=1.943, generator_dur_loss=1.479, generator_adv_loss=2.23, generator_feat_match_loss=4.452, over 465.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:00:04,883 INFO [train.py:845] (3/4) Start epoch 741 2024-02-24 05:02:06,948 INFO [train.py:471] (3/4) Epoch 741, batch 20, global_batch_idx: 27400, batch size: 50, loss[discriminator_loss=2.566, discriminator_real_loss=1.254, discriminator_fake_loss=1.312, generator_loss=29.38, generator_mel_loss=19.49, generator_kl_loss=1.882, generator_dur_loss=1.467, generator_adv_loss=2.23, generator_feat_match_loss=4.309, over 50.00 samples.], tot_loss[discriminator_loss=2.536, discriminator_real_loss=1.281, discriminator_fake_loss=1.255, generator_loss=30.28, generator_mel_loss=19.79, generator_kl_loss=1.92, generator_dur_loss=1.479, generator_adv_loss=2.339, generator_feat_match_loss=4.749, over 1377.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:02:06,950 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 05:02:14,553 INFO [train.py:534] (3/4) Epoch 741, validation: discriminator_loss=2.579, discriminator_real_loss=1.24, discriminator_fake_loss=1.338, generator_loss=30.43, generator_mel_loss=20.39, generator_kl_loss=2.015, generator_dur_loss=1.472, generator_adv_loss=2.07, generator_feat_match_loss=4.485, over 100.00 samples. 
2024-02-24 05:02:14,554 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 05:03:42,516 INFO [train.py:845] (3/4) Start epoch 742 2024-02-24 05:06:54,792 INFO [train.py:471] (3/4) Epoch 742, batch 33, global_batch_idx: 27450, batch size: 58, loss[discriminator_loss=2.551, discriminator_real_loss=1.249, discriminator_fake_loss=1.301, generator_loss=29.99, generator_mel_loss=19.79, generator_kl_loss=1.903, generator_dur_loss=1.495, generator_adv_loss=2.301, generator_feat_match_loss=4.496, over 58.00 samples.], tot_loss[discriminator_loss=2.534, discriminator_real_loss=1.268, discriminator_fake_loss=1.266, generator_loss=30.54, generator_mel_loss=20.06, generator_kl_loss=1.958, generator_dur_loss=1.472, generator_adv_loss=2.322, generator_feat_match_loss=4.731, over 2276.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:07:12,814 INFO [train.py:845] (3/4) Start epoch 743 2024-02-24 05:10:45,648 INFO [train.py:845] (3/4) Start epoch 744 2024-02-24 05:11:46,010 INFO [train.py:471] (3/4) Epoch 744, batch 9, global_batch_idx: 27500, batch size: 110, loss[discriminator_loss=2.607, discriminator_real_loss=1.462, discriminator_fake_loss=1.146, generator_loss=31.06, generator_mel_loss=20.14, generator_kl_loss=1.951, generator_dur_loss=1.471, generator_adv_loss=2.502, generator_feat_match_loss=4.992, over 110.00 samples.], tot_loss[discriminator_loss=2.57, discriminator_real_loss=1.31, discriminator_fake_loss=1.26, generator_loss=30.63, generator_mel_loss=20.16, generator_kl_loss=1.923, generator_dur_loss=1.478, generator_adv_loss=2.332, generator_feat_match_loss=4.738, over 660.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:14:15,423 INFO [train.py:845] (3/4) Start epoch 745 2024-02-24 05:16:27,698 INFO [train.py:471] (3/4) Epoch 745, batch 22, global_batch_idx: 27550, batch size: 95, loss[discriminator_loss=2.613, discriminator_real_loss=1.228, discriminator_fake_loss=1.386, 
generator_loss=29.71, generator_mel_loss=20, generator_kl_loss=2.056, generator_dur_loss=1.45, generator_adv_loss=2.098, generator_feat_match_loss=4.109, over 95.00 samples.], tot_loss[discriminator_loss=2.533, discriminator_real_loss=1.276, discriminator_fake_loss=1.258, generator_loss=30.62, generator_mel_loss=20.16, generator_kl_loss=1.918, generator_dur_loss=1.476, generator_adv_loss=2.312, generator_feat_match_loss=4.748, over 1606.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0 2024-02-24 05:17:40,960 INFO [train.py:845] (3/4) Start epoch 746 2024-02-24 05:20:59,495 INFO [train.py:471] (3/4) Epoch 746, batch 35, global_batch_idx: 27600, batch size: 52, loss[discriminator_loss=2.688, discriminator_real_loss=1.158, discriminator_fake_loss=1.528, generator_loss=30.66, generator_mel_loss=20.37, generator_kl_loss=1.914, generator_dur_loss=1.47, generator_adv_loss=2.143, generator_feat_match_loss=4.762, over 52.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.267, discriminator_fake_loss=1.245, generator_loss=30.51, generator_mel_loss=19.91, generator_kl_loss=1.936, generator_dur_loss=1.466, generator_adv_loss=2.336, generator_feat_match_loss=4.858, over 2932.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:20:59,497 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 05:21:08,493 INFO [train.py:534] (3/4) Epoch 746, validation: discriminator_loss=2.729, discriminator_real_loss=1.148, discriminator_fake_loss=1.581, generator_loss=29.59, generator_mel_loss=20.25, generator_kl_loss=2.084, generator_dur_loss=1.47, generator_adv_loss=1.658, generator_feat_match_loss=4.128, over 100.00 samples. 
2024-02-24 05:21:08,494 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 05:21:15,450 INFO [train.py:845] (3/4) Start epoch 747 2024-02-24 05:24:37,177 INFO [train.py:845] (3/4) Start epoch 748 2024-02-24 05:25:53,303 INFO [train.py:471] (3/4) Epoch 748, batch 11, global_batch_idx: 27650, batch size: 63, loss[discriminator_loss=2.445, discriminator_real_loss=1.285, discriminator_fake_loss=1.16, generator_loss=31.03, generator_mel_loss=20.09, generator_kl_loss=1.897, generator_dur_loss=1.465, generator_adv_loss=2.604, generator_feat_match_loss=4.973, over 63.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.273, discriminator_fake_loss=1.258, generator_loss=30.9, generator_mel_loss=20.22, generator_kl_loss=1.945, generator_dur_loss=1.472, generator_adv_loss=2.406, generator_feat_match_loss=4.853, over 896.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0 2024-02-24 05:28:09,816 INFO [train.py:845] (3/4) Start epoch 749 2024-02-24 05:30:39,227 INFO [train.py:471] (3/4) Epoch 749, batch 24, global_batch_idx: 27700, batch size: 64, loss[discriminator_loss=2.512, discriminator_real_loss=1.271, discriminator_fake_loss=1.24, generator_loss=30.37, generator_mel_loss=20.1, generator_kl_loss=1.926, generator_dur_loss=1.459, generator_adv_loss=2.256, generator_feat_match_loss=4.629, over 64.00 samples.], tot_loss[discriminator_loss=2.554, discriminator_real_loss=1.302, discriminator_fake_loss=1.252, generator_loss=30.18, generator_mel_loss=19.91, generator_kl_loss=1.93, generator_dur_loss=1.477, generator_adv_loss=2.274, generator_feat_match_loss=4.589, over 1638.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:31:41,672 INFO [train.py:845] (3/4) Start epoch 750 2024-02-24 05:35:09,911 INFO [train.py:845] (3/4) Start epoch 751 2024-02-24 05:35:23,451 INFO [train.py:471] (3/4) Epoch 751, batch 0, global_batch_idx: 27750, batch size: 81, loss[discriminator_loss=2.385, 
discriminator_real_loss=1.268, discriminator_fake_loss=1.117, generator_loss=31.12, generator_mel_loss=20.13, generator_kl_loss=1.817, generator_dur_loss=1.461, generator_adv_loss=2.553, generator_feat_match_loss=5.16, over 81.00 samples.], tot_loss[discriminator_loss=2.385, discriminator_real_loss=1.268, discriminator_fake_loss=1.117, generator_loss=31.12, generator_mel_loss=20.13, generator_kl_loss=1.817, generator_dur_loss=1.461, generator_adv_loss=2.553, generator_feat_match_loss=5.16, over 81.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0 2024-02-24 05:38:34,830 INFO [train.py:845] (3/4) Start epoch 752 2024-02-24 05:39:58,364 INFO [train.py:471] (3/4) Epoch 752, batch 13, global_batch_idx: 27800, batch size: 73, loss[discriminator_loss=2.443, discriminator_real_loss=1.285, discriminator_fake_loss=1.158, generator_loss=30.01, generator_mel_loss=19.62, generator_kl_loss=1.894, generator_dur_loss=1.473, generator_adv_loss=2.277, generator_feat_match_loss=4.742, over 73.00 samples.], tot_loss[discriminator_loss=2.528, discriminator_real_loss=1.291, discriminator_fake_loss=1.237, generator_loss=30.19, generator_mel_loss=19.86, generator_kl_loss=1.908, generator_dur_loss=1.473, generator_adv_loss=2.256, generator_feat_match_loss=4.691, over 1074.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:39:58,366 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 05:40:06,985 INFO [train.py:534] (3/4) Epoch 752, validation: discriminator_loss=2.539, discriminator_real_loss=1.153, discriminator_fake_loss=1.386, generator_loss=30.33, generator_mel_loss=20.37, generator_kl_loss=1.985, generator_dur_loss=1.475, generator_adv_loss=1.972, generator_feat_match_loss=4.532, over 100.00 samples. 
2024-02-24 05:40:06,985 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 05:42:13,986 INFO [train.py:845] (3/4) Start epoch 753 2024-02-24 05:44:47,656 INFO [train.py:471] (3/4) Epoch 753, batch 26, global_batch_idx: 27850, batch size: 52, loss[discriminator_loss=2.492, discriminator_real_loss=1.295, discriminator_fake_loss=1.198, generator_loss=30.81, generator_mel_loss=20.29, generator_kl_loss=1.968, generator_dur_loss=1.48, generator_adv_loss=2.398, generator_feat_match_loss=4.68, over 52.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.275, discriminator_fake_loss=1.286, generator_loss=30.55, generator_mel_loss=20.14, generator_kl_loss=1.926, generator_dur_loss=1.467, generator_adv_loss=2.291, generator_feat_match_loss=4.731, over 2052.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:45:43,240 INFO [train.py:845] (3/4) Start epoch 754 2024-02-24 05:49:18,185 INFO [train.py:845] (3/4) Start epoch 755 2024-02-24 05:49:45,246 INFO [train.py:471] (3/4) Epoch 755, batch 2, global_batch_idx: 27900, batch size: 153, loss[discriminator_loss=2.35, discriminator_real_loss=1.162, discriminator_fake_loss=1.188, generator_loss=31.29, generator_mel_loss=19.9, generator_kl_loss=1.958, generator_dur_loss=1.452, generator_adv_loss=2.441, generator_feat_match_loss=5.539, over 153.00 samples.], tot_loss[discriminator_loss=2.364, discriminator_real_loss=1.227, discriminator_fake_loss=1.137, generator_loss=31.15, generator_mel_loss=19.86, generator_kl_loss=1.959, generator_dur_loss=1.463, generator_adv_loss=2.439, generator_feat_match_loss=5.432, over 293.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:52:48,925 INFO [train.py:845] (3/4) Start epoch 756 2024-02-24 05:54:23,586 INFO [train.py:471] (3/4) Epoch 756, batch 15, global_batch_idx: 27950, batch size: 52, loss[discriminator_loss=2.566, discriminator_real_loss=1.414, discriminator_fake_loss=1.152, 
generator_loss=29.93, generator_mel_loss=20, generator_kl_loss=1.978, generator_dur_loss=1.496, generator_adv_loss=2.209, generator_feat_match_loss=4.25, over 52.00 samples.], tot_loss[discriminator_loss=2.549, discriminator_real_loss=1.271, discriminator_fake_loss=1.278, generator_loss=30.23, generator_mel_loss=20.02, generator_kl_loss=1.95, generator_dur_loss=1.469, generator_adv_loss=2.226, generator_feat_match_loss=4.561, over 1148.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0 2024-02-24 05:56:15,637 INFO [train.py:845] (3/4) Start epoch 757 2024-02-24 05:58:57,619 INFO [train.py:471] (3/4) Epoch 757, batch 28, global_batch_idx: 28000, batch size: 85, loss[discriminator_loss=2.66, discriminator_real_loss=1.187, discriminator_fake_loss=1.473, generator_loss=29.86, generator_mel_loss=20.07, generator_kl_loss=1.928, generator_dur_loss=1.479, generator_adv_loss=2.043, generator_feat_match_loss=4.332, over 85.00 samples.], tot_loss[discriminator_loss=2.532, discriminator_real_loss=1.257, discriminator_fake_loss=1.275, generator_loss=30.74, generator_mel_loss=20.13, generator_kl_loss=1.962, generator_dur_loss=1.472, generator_adv_loss=2.332, generator_feat_match_loss=4.843, over 2215.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 05:58:57,621 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 05:59:05,929 INFO [train.py:534] (3/4) Epoch 757, validation: discriminator_loss=2.549, discriminator_real_loss=1.255, discriminator_fake_loss=1.294, generator_loss=30.91, generator_mel_loss=20.79, generator_kl_loss=2.058, generator_dur_loss=1.47, generator_adv_loss=2.022, generator_feat_match_loss=4.573, over 100.00 samples. 
2024-02-24 05:59:05,930 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 05:59:55,140 INFO [train.py:845] (3/4) Start epoch 758 2024-02-24 06:03:20,300 INFO [train.py:845] (3/4) Start epoch 759 2024-02-24 06:03:54,956 INFO [train.py:471] (3/4) Epoch 759, batch 4, global_batch_idx: 28050, batch size: 67, loss[discriminator_loss=2.48, discriminator_real_loss=1.189, discriminator_fake_loss=1.291, generator_loss=30.69, generator_mel_loss=20.12, generator_kl_loss=1.944, generator_dur_loss=1.479, generator_adv_loss=2.221, generator_feat_match_loss=4.926, over 67.00 samples.], tot_loss[discriminator_loss=2.521, discriminator_real_loss=1.258, discriminator_fake_loss=1.264, generator_loss=30.43, generator_mel_loss=19.95, generator_kl_loss=1.949, generator_dur_loss=1.472, generator_adv_loss=2.291, generator_feat_match_loss=4.76, over 306.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:06:50,737 INFO [train.py:845] (3/4) Start epoch 760 2024-02-24 06:08:47,631 INFO [train.py:471] (3/4) Epoch 760, batch 17, global_batch_idx: 28100, batch size: 85, loss[discriminator_loss=2.594, discriminator_real_loss=1.215, discriminator_fake_loss=1.378, generator_loss=30.3, generator_mel_loss=20.04, generator_kl_loss=1.96, generator_dur_loss=1.459, generator_adv_loss=2.158, generator_feat_match_loss=4.68, over 85.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.274, discriminator_fake_loss=1.246, generator_loss=30.65, generator_mel_loss=20.02, generator_kl_loss=1.963, generator_dur_loss=1.464, generator_adv_loss=2.343, generator_feat_match_loss=4.858, over 1456.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:10:22,934 INFO [train.py:845] (3/4) Start epoch 761 2024-02-24 06:13:13,229 INFO [train.py:471] (3/4) Epoch 761, batch 30, global_batch_idx: 28150, batch size: 110, loss[discriminator_loss=2.457, discriminator_real_loss=1.24, discriminator_fake_loss=1.217, 
generator_loss=30.49, generator_mel_loss=20.04, generator_kl_loss=1.945, generator_dur_loss=1.445, generator_adv_loss=2.318, generator_feat_match_loss=4.738, over 110.00 samples.], tot_loss[discriminator_loss=2.53, discriminator_real_loss=1.264, discriminator_fake_loss=1.266, generator_loss=30.38, generator_mel_loss=19.89, generator_kl_loss=1.946, generator_dur_loss=1.468, generator_adv_loss=2.293, generator_feat_match_loss=4.783, over 2447.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:13:50,494 INFO [train.py:845] (3/4) Start epoch 762 2024-02-24 06:17:20,489 INFO [train.py:845] (3/4) Start epoch 763 2024-02-24 06:18:12,367 INFO [train.py:471] (3/4) Epoch 763, batch 6, global_batch_idx: 28200, batch size: 71, loss[discriminator_loss=2.502, discriminator_real_loss=1.231, discriminator_fake_loss=1.271, generator_loss=30.58, generator_mel_loss=19.92, generator_kl_loss=1.905, generator_dur_loss=1.47, generator_adv_loss=2.236, generator_feat_match_loss=5.043, over 71.00 samples.], tot_loss[discriminator_loss=2.515, discriminator_real_loss=1.265, discriminator_fake_loss=1.25, generator_loss=30.85, generator_mel_loss=20.12, generator_kl_loss=1.962, generator_dur_loss=1.458, generator_adv_loss=2.359, generator_feat_match_loss=4.949, over 580.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:18:12,369 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 06:18:21,441 INFO [train.py:534] (3/4) Epoch 763, validation: discriminator_loss=2.717, discriminator_real_loss=1.137, discriminator_fake_loss=1.581, generator_loss=30.59, generator_mel_loss=20.96, generator_kl_loss=1.983, generator_dur_loss=1.474, generator_adv_loss=1.719, generator_feat_match_loss=4.459, over 100.00 samples. 
2024-02-24 06:18:21,442 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 06:21:00,394 INFO [train.py:845] (3/4) Start epoch 764 2024-02-24 06:22:51,540 INFO [train.py:471] (3/4) Epoch 764, batch 19, global_batch_idx: 28250, batch size: 95, loss[discriminator_loss=2.51, discriminator_real_loss=1.423, discriminator_fake_loss=1.087, generator_loss=30.8, generator_mel_loss=19.98, generator_kl_loss=1.974, generator_dur_loss=1.44, generator_adv_loss=2.512, generator_feat_match_loss=4.891, over 95.00 samples.], tot_loss[discriminator_loss=2.495, discriminator_real_loss=1.257, discriminator_fake_loss=1.238, generator_loss=30.57, generator_mel_loss=20.04, generator_kl_loss=1.959, generator_dur_loss=1.471, generator_adv_loss=2.343, generator_feat_match_loss=4.756, over 1471.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0 2024-02-24 06:24:26,676 INFO [train.py:845] (3/4) Start epoch 765 2024-02-24 06:27:35,211 INFO [train.py:471] (3/4) Epoch 765, batch 32, global_batch_idx: 28300, batch size: 60, loss[discriminator_loss=2.441, discriminator_real_loss=1.217, discriminator_fake_loss=1.225, generator_loss=31.55, generator_mel_loss=20.57, generator_kl_loss=1.968, generator_dur_loss=1.488, generator_adv_loss=2.447, generator_feat_match_loss=5.078, over 60.00 samples.], tot_loss[discriminator_loss=2.537, discriminator_real_loss=1.282, discriminator_fake_loss=1.256, generator_loss=30.59, generator_mel_loss=20.11, generator_kl_loss=1.961, generator_dur_loss=1.474, generator_adv_loss=2.323, generator_feat_match_loss=4.725, over 2376.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:27:53,236 INFO [train.py:845] (3/4) Start epoch 766 2024-02-24 06:31:14,475 INFO [train.py:845] (3/4) Start epoch 767 2024-02-24 06:32:07,237 INFO [train.py:471] (3/4) Epoch 767, batch 8, global_batch_idx: 28350, batch size: 126, loss[discriminator_loss=2.477, discriminator_real_loss=1.288, discriminator_fake_loss=1.188, 
generator_loss=31, generator_mel_loss=20.23, generator_kl_loss=2.013, generator_dur_loss=1.452, generator_adv_loss=2.34, generator_feat_match_loss=4.961, over 126.00 samples.], tot_loss[discriminator_loss=2.492, discriminator_real_loss=1.264, discriminator_fake_loss=1.228, generator_loss=30.25, generator_mel_loss=19.79, generator_kl_loss=1.946, generator_dur_loss=1.468, generator_adv_loss=2.296, generator_feat_match_loss=4.746, over 651.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:34:41,223 INFO [train.py:845] (3/4) Start epoch 768 2024-02-24 06:36:45,678 INFO [train.py:471] (3/4) Epoch 768, batch 21, global_batch_idx: 28400, batch size: 79, loss[discriminator_loss=2.438, discriminator_real_loss=1.368, discriminator_fake_loss=1.069, generator_loss=31.19, generator_mel_loss=20.05, generator_kl_loss=1.961, generator_dur_loss=1.486, generator_adv_loss=2.416, generator_feat_match_loss=5.281, over 79.00 samples.], tot_loss[discriminator_loss=2.487, discriminator_real_loss=1.253, discriminator_fake_loss=1.234, generator_loss=30.54, generator_mel_loss=19.92, generator_kl_loss=1.962, generator_dur_loss=1.471, generator_adv_loss=2.335, generator_feat_match_loss=4.844, over 1455.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0 2024-02-24 06:36:45,680 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 06:36:54,937 INFO [train.py:534] (3/4) Epoch 768, validation: discriminator_loss=2.375, discriminator_real_loss=1.125, discriminator_fake_loss=1.25, generator_loss=32.21, generator_mel_loss=20.8, generator_kl_loss=2.137, generator_dur_loss=1.47, generator_adv_loss=2.258, generator_feat_match_loss=5.54, over 100.00 samples. 
2024-02-24 06:36:54,937 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 06:38:21,899 INFO [train.py:845] (3/4) Start epoch 769
2024-02-24 06:41:47,270 INFO [train.py:471] (3/4) Epoch 769, batch 34, global_batch_idx: 28450, batch size: 61, loss[discriminator_loss=2.385, discriminator_real_loss=1.253, discriminator_fake_loss=1.132, generator_loss=30.65, generator_mel_loss=19.75, generator_kl_loss=1.954, generator_dur_loss=1.463, generator_adv_loss=2.342, generator_feat_match_loss=5.133, over 61.00 samples.], tot_loss[discriminator_loss=2.452, discriminator_real_loss=1.236, discriminator_fake_loss=1.216, generator_loss=30.66, generator_mel_loss=19.69, generator_kl_loss=1.935, generator_dur_loss=1.469, generator_adv_loss=2.43, generator_feat_match_loss=5.13, over 2444.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0
2024-02-24 06:41:55,845 INFO [train.py:845] (3/4) Start epoch 770
2024-02-24 06:45:26,909 INFO [train.py:845] (3/4) Start epoch 771
2024-02-24 06:46:30,356 INFO [train.py:471] (3/4) Epoch 771, batch 10, global_batch_idx: 28500, batch size: 154, loss[discriminator_loss=2.434, discriminator_real_loss=1.258, discriminator_fake_loss=1.177, generator_loss=32.09, generator_mel_loss=20.91, generator_kl_loss=1.937, generator_dur_loss=1.463, generator_adv_loss=2.359, generator_feat_match_loss=5.426, over 154.00 samples.], tot_loss[discriminator_loss=2.566, discriminator_real_loss=1.309, discriminator_fake_loss=1.257, generator_loss=30.84, generator_mel_loss=20.43, generator_kl_loss=1.957, generator_dur_loss=1.482, generator_adv_loss=2.278, generator_feat_match_loss=4.693, over 713.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2024-02-24 06:48:58,702 INFO [train.py:845] (3/4) Start epoch 772
2024-02-24 06:51:20,950 INFO [train.py:471] (3/4) Epoch 772, batch 23, global_batch_idx: 28550, batch size: 54, loss[discriminator_loss=2.309, discriminator_real_loss=1.115, discriminator_fake_loss=1.193, generator_loss=30.51, generator_mel_loss=19.3, generator_kl_loss=1.935, generator_dur_loss=1.443, generator_adv_loss=2.555, generator_feat_match_loss=5.281, over 54.00 samples.], tot_loss[discriminator_loss=2.536, discriminator_real_loss=1.271, discriminator_fake_loss=1.265, generator_loss=30.45, generator_mel_loss=19.77, generator_kl_loss=1.928, generator_dur_loss=1.466, generator_adv_loss=2.329, generator_feat_match_loss=4.961, over 1954.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2024-02-24 06:52:27,882 INFO [train.py:845] (3/4) Start epoch 773
2024-02-24 06:56:02,127 INFO [train.py:471] (3/4) Epoch 773, batch 36, global_batch_idx: 28600, batch size: 67, loss[discriminator_loss=2.549, discriminator_real_loss=1.408, discriminator_fake_loss=1.141, generator_loss=31.22, generator_mel_loss=20.28, generator_kl_loss=1.998, generator_dur_loss=1.472, generator_adv_loss=2.441, generator_feat_match_loss=5.031, over 67.00 samples.], tot_loss[discriminator_loss=2.568, discriminator_real_loss=1.3, discriminator_fake_loss=1.268, generator_loss=30.28, generator_mel_loss=20.17, generator_kl_loss=1.952, generator_dur_loss=1.468, generator_adv_loss=2.177, generator_feat_match_loss=4.511, over 2754.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2024-02-24 06:56:02,129 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 06:56:10,325 INFO [train.py:534] (3/4) Epoch 773, validation: discriminator_loss=2.535, discriminator_real_loss=1.156, discriminator_fake_loss=1.379, generator_loss=31.03, generator_mel_loss=20.73, generator_kl_loss=2.099, generator_dur_loss=1.477, generator_adv_loss=1.989, generator_feat_match_loss=4.741, over 100.00 samples.
2024-02-24 06:56:10,326 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 06:56:10,747 INFO [train.py:845] (3/4) Start epoch 774
2024-02-24 06:59:41,293 INFO [train.py:845] (3/4) Start epoch 775
2024-02-24 07:00:54,843 INFO [train.py:471] (3/4) Epoch 775, batch 12, global_batch_idx: 28650, batch size: 90, loss[discriminator_loss=2.484, discriminator_real_loss=1.241, discriminator_fake_loss=1.244, generator_loss=30.05, generator_mel_loss=19.76, generator_kl_loss=1.966, generator_dur_loss=1.463, generator_adv_loss=2.383, generator_feat_match_loss=4.48, over 90.00 samples.], tot_loss[discriminator_loss=2.511, discriminator_real_loss=1.245, discriminator_fake_loss=1.266, generator_loss=30.26, generator_mel_loss=19.75, generator_kl_loss=1.903, generator_dur_loss=1.481, generator_adv_loss=2.331, generator_feat_match_loss=4.796, over 840.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 4.0
2024-02-24 07:03:11,922 INFO [train.py:845] (3/4) Start epoch 776
2024-02-24 07:05:41,371 INFO [train.py:471] (3/4) Epoch 776, batch 25, global_batch_idx: 28700, batch size: 55, loss[discriminator_loss=2.328, discriminator_real_loss=1.136, discriminator_fake_loss=1.192, generator_loss=30.91, generator_mel_loss=19.75, generator_kl_loss=1.923, generator_dur_loss=1.489, generator_adv_loss=2.57, generator_feat_match_loss=5.18, over 55.00 samples.], tot_loss[discriminator_loss=2.494, discriminator_real_loss=1.246, discriminator_fake_loss=1.248, generator_loss=30.8, generator_mel_loss=20.03, generator_kl_loss=1.961, generator_dur_loss=1.476, generator_adv_loss=2.362, generator_feat_match_loss=4.966, over 1707.00 samples.], cur_lr_g: 1.82e-04, cur_lr_d: 1.82e-04, grad_scale: 8.0
2024-02-24 07:06:39,834 INFO [train.py:845] (3/4) Start epoch 777
2024-02-24 07:10:13,360 INFO [train.py:845] (3/4) Start epoch 778
2024-02-24 07:10:30,480 INFO [train.py:471] (3/4) Epoch 778, batch 1, global_batch_idx: 28750, batch size: 73, loss[discriminator_loss=2.422, discriminator_real_loss=1.139, discriminator_fake_loss=1.284, generator_loss=30.08, generator_mel_loss=19.35, generator_kl_loss=1.969, generator_dur_loss=1.473, generator_adv_loss=2.449, generator_feat_match_loss=4.84, over 73.00 samples.], tot_loss[discriminator_loss=2.406, discriminator_real_loss=1.185, discriminator_fake_loss=1.221, generator_loss=30.44, generator_mel_loss=19.55, generator_kl_loss=2.013, generator_dur_loss=1.474, generator_adv_loss=2.353, generator_feat_match_loss=5.054, over 144.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:13:45,591 INFO [train.py:845] (3/4) Start epoch 779
2024-02-24 07:15:16,318 INFO [train.py:471] (3/4) Epoch 779, batch 14, global_batch_idx: 28800, batch size: 56, loss[discriminator_loss=2.559, discriminator_real_loss=1.236, discriminator_fake_loss=1.323, generator_loss=29.79, generator_mel_loss=19.87, generator_kl_loss=1.919, generator_dur_loss=1.5, generator_adv_loss=2.123, generator_feat_match_loss=4.371, over 56.00 samples.], tot_loss[discriminator_loss=2.526, discriminator_real_loss=1.27, discriminator_fake_loss=1.256, generator_loss=30.56, generator_mel_loss=20.03, generator_kl_loss=1.952, generator_dur_loss=1.474, generator_adv_loss=2.294, generator_feat_match_loss=4.811, over 1068.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 16.0
2024-02-24 07:15:16,319 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 07:15:25,586 INFO [train.py:534] (3/4) Epoch 779, validation: discriminator_loss=2.589, discriminator_real_loss=1.221, discriminator_fake_loss=1.368, generator_loss=30.77, generator_mel_loss=20.74, generator_kl_loss=2.139, generator_dur_loss=1.47, generator_adv_loss=2.005, generator_feat_match_loss=4.42, over 100.00 samples.
2024-02-24 07:15:25,587 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 07:17:17,166 INFO [train.py:845] (3/4) Start epoch 780
2024-02-24 07:19:47,135 INFO [train.py:471] (3/4) Epoch 780, batch 27, global_batch_idx: 28850, batch size: 52, loss[discriminator_loss=2.395, discriminator_real_loss=1.235, discriminator_fake_loss=1.159, generator_loss=30.94, generator_mel_loss=19.57, generator_kl_loss=1.918, generator_dur_loss=1.487, generator_adv_loss=2.445, generator_feat_match_loss=5.516, over 52.00 samples.], tot_loss[discriminator_loss=2.462, discriminator_real_loss=1.247, discriminator_fake_loss=1.215, generator_loss=30.56, generator_mel_loss=19.67, generator_kl_loss=1.946, generator_dur_loss=1.469, generator_adv_loss=2.396, generator_feat_match_loss=5.083, over 2039.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 07:20:41,253 INFO [train.py:845] (3/4) Start epoch 781
2024-02-24 07:24:09,061 INFO [train.py:845] (3/4) Start epoch 782
2024-02-24 07:24:37,467 INFO [train.py:471] (3/4) Epoch 782, batch 3, global_batch_idx: 28900, batch size: 50, loss[discriminator_loss=2.605, discriminator_real_loss=1.241, discriminator_fake_loss=1.364, generator_loss=29.69, generator_mel_loss=19.8, generator_kl_loss=1.899, generator_dur_loss=1.473, generator_adv_loss=2.213, generator_feat_match_loss=4.301, over 50.00 samples.], tot_loss[discriminator_loss=2.542, discriminator_real_loss=1.285, discriminator_fake_loss=1.257, generator_loss=30.22, generator_mel_loss=19.93, generator_kl_loss=1.969, generator_dur_loss=1.473, generator_adv_loss=2.189, generator_feat_match_loss=4.655, over 310.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:27:33,229 INFO [train.py:845] (3/4) Start epoch 783
2024-02-24 07:29:08,279 INFO [train.py:471] (3/4) Epoch 783, batch 16, global_batch_idx: 28950, batch size: 52, loss[discriminator_loss=2.553, discriminator_real_loss=1.345, discriminator_fake_loss=1.208, generator_loss=30.66, generator_mel_loss=19.98, generator_kl_loss=1.892, generator_dur_loss=1.481, generator_adv_loss=2.453, generator_feat_match_loss=4.855, over 52.00 samples.], tot_loss[discriminator_loss=2.544, discriminator_real_loss=1.281, discriminator_fake_loss=1.263, generator_loss=30.57, generator_mel_loss=20.03, generator_kl_loss=1.955, generator_dur_loss=1.468, generator_adv_loss=2.339, generator_feat_match_loss=4.77, over 1219.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:31:02,583 INFO [train.py:845] (3/4) Start epoch 784
2024-02-24 07:33:45,332 INFO [train.py:471] (3/4) Epoch 784, batch 29, global_batch_idx: 29000, batch size: 101, loss[discriminator_loss=2.512, discriminator_real_loss=1.133, discriminator_fake_loss=1.38, generator_loss=31.08, generator_mel_loss=20.18, generator_kl_loss=1.945, generator_dur_loss=1.472, generator_adv_loss=2.297, generator_feat_match_loss=5.188, over 101.00 samples.], tot_loss[discriminator_loss=2.452, discriminator_real_loss=1.217, discriminator_fake_loss=1.235, generator_loss=30.62, generator_mel_loss=19.74, generator_kl_loss=1.939, generator_dur_loss=1.465, generator_adv_loss=2.394, generator_feat_match_loss=5.076, over 2440.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:33:45,334 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 07:33:53,668 INFO [train.py:534] (3/4) Epoch 784, validation: discriminator_loss=2.572, discriminator_real_loss=1.211, discriminator_fake_loss=1.361, generator_loss=30.38, generator_mel_loss=20.35, generator_kl_loss=2.043, generator_dur_loss=1.468, generator_adv_loss=1.891, generator_feat_match_loss=4.631, over 100.00 samples.
2024-02-24 07:33:53,669 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 07:34:35,699 INFO [train.py:845] (3/4) Start epoch 785
2024-02-24 07:38:01,834 INFO [train.py:845] (3/4) Start epoch 786
2024-02-24 07:38:41,873 INFO [train.py:471] (3/4) Epoch 786, batch 5, global_batch_idx: 29050, batch size: 55, loss[discriminator_loss=2.754, discriminator_real_loss=1.299, discriminator_fake_loss=1.454, generator_loss=29.26, generator_mel_loss=19.56, generator_kl_loss=1.887, generator_dur_loss=1.477, generator_adv_loss=2.115, generator_feat_match_loss=4.215, over 55.00 samples.], tot_loss[discriminator_loss=2.468, discriminator_real_loss=1.234, discriminator_fake_loss=1.233, generator_loss=30.57, generator_mel_loss=19.87, generator_kl_loss=1.889, generator_dur_loss=1.466, generator_adv_loss=2.339, generator_feat_match_loss=5, over 524.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 07:41:29,548 INFO [train.py:845] (3/4) Start epoch 787
2024-02-24 07:43:17,689 INFO [train.py:471] (3/4) Epoch 787, batch 18, global_batch_idx: 29100, batch size: 52, loss[discriminator_loss=2.68, discriminator_real_loss=1.178, discriminator_fake_loss=1.502, generator_loss=30.17, generator_mel_loss=20.05, generator_kl_loss=1.978, generator_dur_loss=1.456, generator_adv_loss=2.24, generator_feat_match_loss=4.449, over 52.00 samples.], tot_loss[discriminator_loss=2.482, discriminator_real_loss=1.238, discriminator_fake_loss=1.244, generator_loss=30.3, generator_mel_loss=19.62, generator_kl_loss=1.942, generator_dur_loss=1.477, generator_adv_loss=2.357, generator_feat_match_loss=4.904, over 1238.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:44:57,267 INFO [train.py:845] (3/4) Start epoch 788
2024-02-24 07:47:52,114 INFO [train.py:471] (3/4) Epoch 788, batch 31, global_batch_idx: 29150, batch size: 71, loss[discriminator_loss=2.504, discriminator_real_loss=1.312, discriminator_fake_loss=1.191, generator_loss=30.03, generator_mel_loss=19.95, generator_kl_loss=1.945, generator_dur_loss=1.471, generator_adv_loss=2.195, generator_feat_match_loss=4.461, over 71.00 samples.], tot_loss[discriminator_loss=2.559, discriminator_real_loss=1.301, discriminator_fake_loss=1.257, generator_loss=30.19, generator_mel_loss=20.04, generator_kl_loss=1.974, generator_dur_loss=1.472, generator_adv_loss=2.181, generator_feat_match_loss=4.521, over 2158.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:48:23,743 INFO [train.py:845] (3/4) Start epoch 789
2024-02-24 07:51:51,980 INFO [train.py:845] (3/4) Start epoch 790
2024-02-24 07:52:38,840 INFO [train.py:471] (3/4) Epoch 790, batch 7, global_batch_idx: 29200, batch size: 85, loss[discriminator_loss=2.498, discriminator_real_loss=1.383, discriminator_fake_loss=1.115, generator_loss=30.68, generator_mel_loss=19.56, generator_kl_loss=1.868, generator_dur_loss=1.461, generator_adv_loss=2.496, generator_feat_match_loss=5.301, over 85.00 samples.], tot_loss[discriminator_loss=2.522, discriminator_real_loss=1.294, discriminator_fake_loss=1.228, generator_loss=30.24, generator_mel_loss=19.48, generator_kl_loss=1.92, generator_dur_loss=1.47, generator_adv_loss=2.387, generator_feat_match_loss=4.98, over 647.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:52:38,842 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 07:52:47,617 INFO [train.py:534] (3/4) Epoch 790, validation: discriminator_loss=2.39, discriminator_real_loss=1.175, discriminator_fake_loss=1.215, generator_loss=31.85, generator_mel_loss=20.66, generator_kl_loss=2.072, generator_dur_loss=1.477, generator_adv_loss=2.236, generator_feat_match_loss=5.412, over 100.00 samples.
2024-02-24 07:52:47,618 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 07:55:33,562 INFO [train.py:845] (3/4) Start epoch 791
2024-02-24 07:57:35,637 INFO [train.py:471] (3/4) Epoch 791, batch 20, global_batch_idx: 29250, batch size: 69, loss[discriminator_loss=2.578, discriminator_real_loss=1.133, discriminator_fake_loss=1.445, generator_loss=29.93, generator_mel_loss=19.59, generator_kl_loss=1.849, generator_dur_loss=1.445, generator_adv_loss=2.197, generator_feat_match_loss=4.852, over 69.00 samples.], tot_loss[discriminator_loss=2.542, discriminator_real_loss=1.295, discriminator_fake_loss=1.248, generator_loss=30.49, generator_mel_loss=19.93, generator_kl_loss=1.963, generator_dur_loss=1.467, generator_adv_loss=2.302, generator_feat_match_loss=4.827, over 1570.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 07:59:06,494 INFO [train.py:845] (3/4) Start epoch 792
2024-02-24 08:02:24,096 INFO [train.py:471] (3/4) Epoch 792, batch 33, global_batch_idx: 29300, batch size: 55, loss[discriminator_loss=2.467, discriminator_real_loss=1.218, discriminator_fake_loss=1.249, generator_loss=30.81, generator_mel_loss=20.02, generator_kl_loss=1.943, generator_dur_loss=1.495, generator_adv_loss=2.242, generator_feat_match_loss=5.109, over 55.00 samples.], tot_loss[discriminator_loss=2.56, discriminator_real_loss=1.3, discriminator_fake_loss=1.26, generator_loss=30.08, generator_mel_loss=19.91, generator_kl_loss=1.964, generator_dur_loss=1.468, generator_adv_loss=2.176, generator_feat_match_loss=4.559, over 2316.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:02:41,661 INFO [train.py:845] (3/4) Start epoch 793
2024-02-24 08:06:04,635 INFO [train.py:845] (3/4) Start epoch 794
2024-02-24 08:07:02,512 INFO [train.py:471] (3/4) Epoch 794, batch 9, global_batch_idx: 29350, batch size: 50, loss[discriminator_loss=2.568, discriminator_real_loss=1.35, discriminator_fake_loss=1.219, generator_loss=29.9, generator_mel_loss=19.75, generator_kl_loss=1.877, generator_dur_loss=1.458, generator_adv_loss=2.355, generator_feat_match_loss=4.461, over 50.00 samples.], tot_loss[discriminator_loss=2.41, discriminator_real_loss=1.215, discriminator_fake_loss=1.194, generator_loss=30.54, generator_mel_loss=19.6, generator_kl_loss=1.942, generator_dur_loss=1.479, generator_adv_loss=2.423, generator_feat_match_loss=5.091, over 617.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:09:34,313 INFO [train.py:845] (3/4) Start epoch 795
2024-02-24 08:11:50,424 INFO [train.py:471] (3/4) Epoch 795, batch 22, global_batch_idx: 29400, batch size: 58, loss[discriminator_loss=2.566, discriminator_real_loss=1.309, discriminator_fake_loss=1.259, generator_loss=30.27, generator_mel_loss=19.76, generator_kl_loss=1.878, generator_dur_loss=1.444, generator_adv_loss=2.402, generator_feat_match_loss=4.785, over 58.00 samples.], tot_loss[discriminator_loss=2.525, discriminator_real_loss=1.28, discriminator_fake_loss=1.245, generator_loss=30.8, generator_mel_loss=20.24, generator_kl_loss=1.964, generator_dur_loss=1.47, generator_adv_loss=2.296, generator_feat_match_loss=4.831, over 1770.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:11:50,426 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 08:11:58,165 INFO [train.py:534] (3/4) Epoch 795, validation: discriminator_loss=2.464, discriminator_real_loss=1.165, discriminator_fake_loss=1.299, generator_loss=31.25, generator_mel_loss=20.94, generator_kl_loss=2.144, generator_dur_loss=1.465, generator_adv_loss=1.949, generator_feat_match_loss=4.756, over 100.00 samples.
2024-02-24 08:11:58,165 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 08:13:12,614 INFO [train.py:845] (3/4) Start epoch 796
2024-02-24 08:16:40,097 INFO [train.py:471] (3/4) Epoch 796, batch 35, global_batch_idx: 29450, batch size: 110, loss[discriminator_loss=2.375, discriminator_real_loss=1.318, discriminator_fake_loss=1.056, generator_loss=31.65, generator_mel_loss=20.22, generator_kl_loss=1.972, generator_dur_loss=1.47, generator_adv_loss=2.484, generator_feat_match_loss=5.5, over 110.00 samples.], tot_loss[discriminator_loss=2.496, discriminator_real_loss=1.267, discriminator_fake_loss=1.229, generator_loss=30.47, generator_mel_loss=19.66, generator_kl_loss=1.923, generator_dur_loss=1.471, generator_adv_loss=2.389, generator_feat_match_loss=5.028, over 2801.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:16:46,346 INFO [train.py:845] (3/4) Start epoch 797
2024-02-24 08:20:15,876 INFO [train.py:845] (3/4) Start epoch 798
2024-02-24 08:21:30,783 INFO [train.py:471] (3/4) Epoch 798, batch 11, global_batch_idx: 29500, batch size: 90, loss[discriminator_loss=2.502, discriminator_real_loss=1.25, discriminator_fake_loss=1.252, generator_loss=30.22, generator_mel_loss=19.86, generator_kl_loss=2.021, generator_dur_loss=1.444, generator_adv_loss=2.26, generator_feat_match_loss=4.633, over 90.00 samples.], tot_loss[discriminator_loss=2.54, discriminator_real_loss=1.28, discriminator_fake_loss=1.26, generator_loss=30.79, generator_mel_loss=20.14, generator_kl_loss=2.008, generator_dur_loss=1.468, generator_adv_loss=2.306, generator_feat_match_loss=4.864, over 871.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:23:44,868 INFO [train.py:845] (3/4) Start epoch 799
2024-02-24 08:26:02,720 INFO [train.py:471] (3/4) Epoch 799, batch 24, global_batch_idx: 29550, batch size: 51, loss[discriminator_loss=2.266, discriminator_real_loss=1.113, discriminator_fake_loss=1.152, generator_loss=30.87, generator_mel_loss=19.51, generator_kl_loss=1.98, generator_dur_loss=1.495, generator_adv_loss=2.486, generator_feat_match_loss=5.398, over 51.00 samples.], tot_loss[discriminator_loss=2.51, discriminator_real_loss=1.268, discriminator_fake_loss=1.242, generator_loss=30.68, generator_mel_loss=19.82, generator_kl_loss=1.947, generator_dur_loss=1.471, generator_adv_loss=2.401, generator_feat_match_loss=5.036, over 1810.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:27:10,647 INFO [train.py:845] (3/4) Start epoch 800
2024-02-24 08:30:35,324 INFO [train.py:845] (3/4) Start epoch 801
2024-02-24 08:30:48,723 INFO [train.py:471] (3/4) Epoch 801, batch 0, global_batch_idx: 29600, batch size: 126, loss[discriminator_loss=2.648, discriminator_real_loss=1.416, discriminator_fake_loss=1.232, generator_loss=30.64, generator_mel_loss=20.28, generator_kl_loss=1.978, generator_dur_loss=1.474, generator_adv_loss=2.238, generator_feat_match_loss=4.672, over 126.00 samples.], tot_loss[discriminator_loss=2.648, discriminator_real_loss=1.416, discriminator_fake_loss=1.232, generator_loss=30.64, generator_mel_loss=20.28, generator_kl_loss=1.978, generator_dur_loss=1.474, generator_adv_loss=2.238, generator_feat_match_loss=4.672, over 126.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:30:48,724 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 08:30:57,707 INFO [train.py:534] (3/4) Epoch 801, validation: discriminator_loss=2.63, discriminator_real_loss=1.412, discriminator_fake_loss=1.218, generator_loss=31.49, generator_mel_loss=20.82, generator_kl_loss=2.175, generator_dur_loss=1.469, generator_adv_loss=2.263, generator_feat_match_loss=4.765, over 100.00 samples.
2024-02-24 08:30:57,708 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 08:34:15,839 INFO [train.py:845] (3/4) Start epoch 802
2024-02-24 08:35:33,305 INFO [train.py:471] (3/4) Epoch 802, batch 13, global_batch_idx: 29650, batch size: 126, loss[discriminator_loss=2.566, discriminator_real_loss=1.222, discriminator_fake_loss=1.345, generator_loss=30.31, generator_mel_loss=20.11, generator_kl_loss=2.041, generator_dur_loss=1.463, generator_adv_loss=2.104, generator_feat_match_loss=4.59, over 126.00 samples.], tot_loss[discriminator_loss=2.571, discriminator_real_loss=1.277, discriminator_fake_loss=1.294, generator_loss=29.94, generator_mel_loss=19.9, generator_kl_loss=1.959, generator_dur_loss=1.472, generator_adv_loss=2.135, generator_feat_match_loss=4.473, over 1062.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:37:49,168 INFO [train.py:845] (3/4) Start epoch 803
2024-02-24 08:40:19,931 INFO [train.py:471] (3/4) Epoch 803, batch 26, global_batch_idx: 29700, batch size: 52, loss[discriminator_loss=2.594, discriminator_real_loss=1.487, discriminator_fake_loss=1.105, generator_loss=30.76, generator_mel_loss=19.91, generator_kl_loss=1.879, generator_dur_loss=1.462, generator_adv_loss=2.43, generator_feat_match_loss=5.086, over 52.00 samples.], tot_loss[discriminator_loss=2.62, discriminator_real_loss=1.343, discriminator_fake_loss=1.277, generator_loss=30.3, generator_mel_loss=19.99, generator_kl_loss=1.964, generator_dur_loss=1.471, generator_adv_loss=2.24, generator_feat_match_loss=4.634, over 1960.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:41:20,644 INFO [train.py:845] (3/4) Start epoch 804
2024-02-24 08:44:51,223 INFO [train.py:845] (3/4) Start epoch 805
2024-02-24 08:45:18,317 INFO [train.py:471] (3/4) Epoch 805, batch 2, global_batch_idx: 29750, batch size: 53, loss[discriminator_loss=2.605, discriminator_real_loss=1.435, discriminator_fake_loss=1.17, generator_loss=30.24, generator_mel_loss=19.55, generator_kl_loss=1.989, generator_dur_loss=1.49, generator_adv_loss=2.521, generator_feat_match_loss=4.691, over 53.00 samples.], tot_loss[discriminator_loss=2.641, discriminator_real_loss=1.236, discriminator_fake_loss=1.405, generator_loss=30.02, generator_mel_loss=19.66, generator_kl_loss=2.008, generator_dur_loss=1.477, generator_adv_loss=2.245, generator_feat_match_loss=4.627, over 274.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:48:23,048 INFO [train.py:845] (3/4) Start epoch 806
2024-02-24 08:49:53,356 INFO [train.py:471] (3/4) Epoch 806, batch 15, global_batch_idx: 29800, batch size: 64, loss[discriminator_loss=2.68, discriminator_real_loss=1.312, discriminator_fake_loss=1.369, generator_loss=29.82, generator_mel_loss=19.77, generator_kl_loss=1.89, generator_dur_loss=1.464, generator_adv_loss=2.229, generator_feat_match_loss=4.473, over 64.00 samples.], tot_loss[discriminator_loss=2.546, discriminator_real_loss=1.282, discriminator_fake_loss=1.264, generator_loss=30.71, generator_mel_loss=20.07, generator_kl_loss=1.957, generator_dur_loss=1.476, generator_adv_loss=2.326, generator_feat_match_loss=4.882, over 1060.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 08:49:53,357 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 08:50:01,620 INFO [train.py:534] (3/4) Epoch 806, validation: discriminator_loss=2.508, discriminator_real_loss=1.351, discriminator_fake_loss=1.156, generator_loss=31.44, generator_mel_loss=20.76, generator_kl_loss=2.087, generator_dur_loss=1.473, generator_adv_loss=2.278, generator_feat_match_loss=4.843, over 100.00 samples.
2024-02-24 08:50:01,621 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 08:51:55,356 INFO [train.py:845] (3/4) Start epoch 807
2024-02-24 08:54:37,103 INFO [train.py:471] (3/4) Epoch 807, batch 28, global_batch_idx: 29850, batch size: 59, loss[discriminator_loss=2.445, discriminator_real_loss=1.232, discriminator_fake_loss=1.214, generator_loss=30.24, generator_mel_loss=19.5, generator_kl_loss=1.932, generator_dur_loss=1.483, generator_adv_loss=2.34, generator_feat_match_loss=4.988, over 59.00 samples.], tot_loss[discriminator_loss=2.539, discriminator_real_loss=1.284, discriminator_fake_loss=1.255, generator_loss=30.42, generator_mel_loss=19.94, generator_kl_loss=1.963, generator_dur_loss=1.471, generator_adv_loss=2.295, generator_feat_match_loss=4.748, over 2041.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 08:55:27,445 INFO [train.py:845] (3/4) Start epoch 808
2024-02-24 08:59:02,454 INFO [train.py:845] (3/4) Start epoch 809
2024-02-24 08:59:34,171 INFO [train.py:471] (3/4) Epoch 809, batch 4, global_batch_idx: 29900, batch size: 53, loss[discriminator_loss=2.422, discriminator_real_loss=1.199, discriminator_fake_loss=1.224, generator_loss=30.4, generator_mel_loss=19.9, generator_kl_loss=2.014, generator_dur_loss=1.476, generator_adv_loss=2.297, generator_feat_match_loss=4.711, over 53.00 samples.], tot_loss[discriminator_loss=2.52, discriminator_real_loss=1.28, discriminator_fake_loss=1.241, generator_loss=29.95, generator_mel_loss=19.7, generator_kl_loss=1.939, generator_dur_loss=1.475, generator_adv_loss=2.216, generator_feat_match_loss=4.618, over 334.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:02:31,980 INFO [train.py:845] (3/4) Start epoch 810
2024-02-24 09:04:19,997 INFO [train.py:471] (3/4) Epoch 810, batch 17, global_batch_idx: 29950, batch size: 69, loss[discriminator_loss=2.68, discriminator_real_loss=1.233, discriminator_fake_loss=1.447, generator_loss=29.92, generator_mel_loss=19.96, generator_kl_loss=1.896, generator_dur_loss=1.46, generator_adv_loss=2.238, generator_feat_match_loss=4.359, over 69.00 samples.], tot_loss[discriminator_loss=2.558, discriminator_real_loss=1.288, discriminator_fake_loss=1.27, generator_loss=30.53, generator_mel_loss=19.99, generator_kl_loss=1.934, generator_dur_loss=1.468, generator_adv_loss=2.293, generator_feat_match_loss=4.842, over 1363.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 09:06:00,996 INFO [train.py:845] (3/4) Start epoch 811
2024-02-24 09:08:49,489 INFO [train.py:471] (3/4) Epoch 811, batch 30, global_batch_idx: 30000, batch size: 69, loss[discriminator_loss=2.469, discriminator_real_loss=1.266, discriminator_fake_loss=1.204, generator_loss=30.92, generator_mel_loss=19.95, generator_kl_loss=1.974, generator_dur_loss=1.474, generator_adv_loss=2.459, generator_feat_match_loss=5.059, over 69.00 samples.], tot_loss[discriminator_loss=2.532, discriminator_real_loss=1.266, discriminator_fake_loss=1.265, generator_loss=30.35, generator_mel_loss=20.01, generator_kl_loss=1.964, generator_dur_loss=1.47, generator_adv_loss=2.22, generator_feat_match_loss=4.687, over 2380.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:08:49,490 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 09:08:58,144 INFO [train.py:534] (3/4) Epoch 811, validation: discriminator_loss=2.531, discriminator_real_loss=1.278, discriminator_fake_loss=1.254, generator_loss=31.02, generator_mel_loss=20.56, generator_kl_loss=2.083, generator_dur_loss=1.477, generator_adv_loss=2.036, generator_feat_match_loss=4.86, over 100.00 samples.
2024-02-24 09:08:58,145 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 09:09:34,949 INFO [train.py:845] (3/4) Start epoch 812
2024-02-24 09:13:07,964 INFO [train.py:845] (3/4) Start epoch 813
2024-02-24 09:13:51,452 INFO [train.py:471] (3/4) Epoch 813, batch 6, global_batch_idx: 30050, batch size: 71, loss[discriminator_loss=2.379, discriminator_real_loss=1.098, discriminator_fake_loss=1.28, generator_loss=31.2, generator_mel_loss=19.8, generator_kl_loss=1.975, generator_dur_loss=1.488, generator_adv_loss=2.432, generator_feat_match_loss=5.5, over 71.00 samples.], tot_loss[discriminator_loss=2.328, discriminator_real_loss=1.183, discriminator_fake_loss=1.145, generator_loss=31.31, generator_mel_loss=19.69, generator_kl_loss=1.959, generator_dur_loss=1.476, generator_adv_loss=2.522, generator_feat_match_loss=5.666, over 477.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:16:36,640 INFO [train.py:845] (3/4) Start epoch 814
2024-02-24 09:18:39,720 INFO [train.py:471] (3/4) Epoch 814, batch 19, global_batch_idx: 30100, batch size: 69, loss[discriminator_loss=2.469, discriminator_real_loss=1.267, discriminator_fake_loss=1.201, generator_loss=29.97, generator_mel_loss=19.74, generator_kl_loss=1.941, generator_dur_loss=1.47, generator_adv_loss=2.236, generator_feat_match_loss=4.578, over 69.00 samples.], tot_loss[discriminator_loss=2.505, discriminator_real_loss=1.274, discriminator_fake_loss=1.23, generator_loss=30.49, generator_mel_loss=19.95, generator_kl_loss=1.967, generator_dur_loss=1.472, generator_adv_loss=2.274, generator_feat_match_loss=4.821, over 1356.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:20:02,578 INFO [train.py:845] (3/4) Start epoch 815
2024-02-24 09:23:14,491 INFO [train.py:471] (3/4) Epoch 815, batch 32, global_batch_idx: 30150, batch size: 58, loss[discriminator_loss=2.535, discriminator_real_loss=1.363, discriminator_fake_loss=1.172, generator_loss=30.64, generator_mel_loss=19.44, generator_kl_loss=1.906, generator_dur_loss=1.482, generator_adv_loss=2.496, generator_feat_match_loss=5.312, over 58.00 samples.], tot_loss[discriminator_loss=2.546, discriminator_real_loss=1.305, discriminator_fake_loss=1.241, generator_loss=30.56, generator_mel_loss=19.75, generator_kl_loss=1.988, generator_dur_loss=1.463, generator_adv_loss=2.39, generator_feat_match_loss=4.97, over 2660.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 09:23:31,410 INFO [train.py:845] (3/4) Start epoch 816
2024-02-24 09:27:03,456 INFO [train.py:845] (3/4) Start epoch 817
2024-02-24 09:28:01,957 INFO [train.py:471] (3/4) Epoch 817, batch 8, global_batch_idx: 30200, batch size: 126, loss[discriminator_loss=2.562, discriminator_real_loss=1.358, discriminator_fake_loss=1.204, generator_loss=30.21, generator_mel_loss=19.93, generator_kl_loss=1.904, generator_dur_loss=1.467, generator_adv_loss=2.219, generator_feat_match_loss=4.691, over 126.00 samples.], tot_loss[discriminator_loss=2.516, discriminator_real_loss=1.295, discriminator_fake_loss=1.221, generator_loss=30.19, generator_mel_loss=19.78, generator_kl_loss=1.941, generator_dur_loss=1.465, generator_adv_loss=2.276, generator_feat_match_loss=4.729, over 731.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:28:01,959 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 09:28:10,657 INFO [train.py:534] (3/4) Epoch 817, validation: discriminator_loss=2.487, discriminator_real_loss=1.243, discriminator_fake_loss=1.244, generator_loss=31.26, generator_mel_loss=20.23, generator_kl_loss=2.19, generator_dur_loss=1.469, generator_adv_loss=2.276, generator_feat_match_loss=5.096, over 100.00 samples.
2024-02-24 09:28:10,658 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 09:30:45,424 INFO [train.py:845] (3/4) Start epoch 818
2024-02-24 09:33:06,591 INFO [train.py:471] (3/4) Epoch 818, batch 21, global_batch_idx: 30250, batch size: 85, loss[discriminator_loss=2.57, discriminator_real_loss=1.188, discriminator_fake_loss=1.384, generator_loss=30.11, generator_mel_loss=19.56, generator_kl_loss=1.913, generator_dur_loss=1.48, generator_adv_loss=2.377, generator_feat_match_loss=4.777, over 85.00 samples.], tot_loss[discriminator_loss=2.418, discriminator_real_loss=1.214, discriminator_fake_loss=1.204, generator_loss=30.75, generator_mel_loss=19.66, generator_kl_loss=1.941, generator_dur_loss=1.471, generator_adv_loss=2.432, generator_feat_match_loss=5.249, over 1664.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:34:18,878 INFO [train.py:845] (3/4) Start epoch 819
2024-02-24 09:37:33,791 INFO [train.py:471] (3/4) Epoch 819, batch 34, global_batch_idx: 30300, batch size: 85, loss[discriminator_loss=2.586, discriminator_real_loss=1.237, discriminator_fake_loss=1.348, generator_loss=30, generator_mel_loss=19.74, generator_kl_loss=1.897, generator_dur_loss=1.463, generator_adv_loss=2.5, generator_feat_match_loss=4.406, over 85.00 samples.], tot_loss[discriminator_loss=2.493, discriminator_real_loss=1.252, discriminator_fake_loss=1.241, generator_loss=30.53, generator_mel_loss=19.78, generator_kl_loss=1.968, generator_dur_loss=1.471, generator_adv_loss=2.351, generator_feat_match_loss=4.958, over 2456.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 8.0
2024-02-24 09:37:48,864 INFO [train.py:845] (3/4) Start epoch 820
2024-02-24 09:41:12,469 INFO [train.py:845] (3/4) Start epoch 821
2024-02-24 09:42:18,878 INFO [train.py:471] (3/4) Epoch 821, batch 10, global_batch_idx: 30350, batch size: 79, loss[discriminator_loss=2.531, discriminator_real_loss=1.335, discriminator_fake_loss=1.195, generator_loss=30.9, generator_mel_loss=20.03, generator_kl_loss=2.021, generator_dur_loss=1.475, generator_adv_loss=2.271, generator_feat_match_loss=5.102, over 79.00 samples.], tot_loss[discriminator_loss=2.506, discriminator_real_loss=1.264, discriminator_fake_loss=1.242, generator_loss=30.38, generator_mel_loss=19.87, generator_kl_loss=1.971, generator_dur_loss=1.467, generator_adv_loss=2.309, generator_feat_match_loss=4.766, over 830.00 samples.], cur_lr_g: 1.81e-04, cur_lr_d: 1.81e-04, grad_scale: 4.0
2024-02-24 09:44:38,482 INFO [train.py:845] (3/4) Start epoch 822
2024-02-24 09:46:59,031 INFO [train.py:471] (3/4) Epoch 822, batch 23, global_batch_idx: 30400, batch size: 110, loss[discriminator_loss=2.279, discriminator_real_loss=1.229, discriminator_fake_loss=1.051, generator_loss=31.83, generator_mel_loss=19.96, generator_kl_loss=2.064, generator_dur_loss=1.465, generator_adv_loss=2.574, generator_feat_match_loss=5.766, over 110.00 samples.], tot_loss[discriminator_loss=2.471, discriminator_real_loss=1.253, discriminator_fake_loss=1.219, generator_loss=30.73, generator_mel_loss=19.7, generator_kl_loss=1.951, generator_dur_loss=1.472, generator_adv_loss=2.434, generator_feat_match_loss=5.177, over 1771.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0
2024-02-24 09:46:59,033 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 09:47:07,640 INFO [train.py:534] (3/4) Epoch 822, validation: discriminator_loss=2.225, discriminator_real_loss=1.095, discriminator_fake_loss=1.13, generator_loss=31.59, generator_mel_loss=19.82, generator_kl_loss=2.032, generator_dur_loss=1.473, generator_adv_loss=2.522, generator_feat_match_loss=5.739, over 100.00 samples.
2024-02-24 09:47:07,640 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 09:48:15,893 INFO [train.py:845] (3/4) Start epoch 823 2024-02-24 09:51:47,182 INFO [train.py:471] (3/4) Epoch 823, batch 36, global_batch_idx: 30450, batch size: 60, loss[discriminator_loss=2.508, discriminator_real_loss=1.284, discriminator_fake_loss=1.223, generator_loss=31.04, generator_mel_loss=20.06, generator_kl_loss=2.005, generator_dur_loss=1.476, generator_adv_loss=2.449, generator_feat_match_loss=5.043, over 60.00 samples.], tot_loss[discriminator_loss=2.518, discriminator_real_loss=1.279, discriminator_fake_loss=1.239, generator_loss=30.25, generator_mel_loss=19.78, generator_kl_loss=1.951, generator_dur_loss=1.467, generator_adv_loss=2.28, generator_feat_match_loss=4.776, over 2828.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 4.0 2024-02-24 09:51:47,591 INFO [train.py:845] (3/4) Start epoch 824 2024-02-24 09:55:08,948 INFO [train.py:845] (3/4) Start epoch 825 2024-02-24 09:56:30,609 INFO [train.py:471] (3/4) Epoch 825, batch 12, global_batch_idx: 30500, batch size: 85, loss[discriminator_loss=2.494, discriminator_real_loss=1.143, discriminator_fake_loss=1.352, generator_loss=30.77, generator_mel_loss=19.85, generator_kl_loss=1.937, generator_dur_loss=1.461, generator_adv_loss=2.383, generator_feat_match_loss=5.141, over 85.00 samples.], tot_loss[discriminator_loss=2.539, discriminator_real_loss=1.264, discriminator_fake_loss=1.275, generator_loss=30.82, generator_mel_loss=20.04, generator_kl_loss=1.953, generator_dur_loss=1.471, generator_adv_loss=2.358, generator_feat_match_loss=4.994, over 1057.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 09:58:36,844 INFO [train.py:845] (3/4) Start epoch 826 2024-02-24 10:01:06,116 INFO [train.py:471] (3/4) Epoch 826, batch 25, global_batch_idx: 30550, batch size: 61, loss[discriminator_loss=2.746, discriminator_real_loss=1.177, 
discriminator_fake_loss=1.568, generator_loss=29.96, generator_mel_loss=19.47, generator_kl_loss=2.055, generator_dur_loss=1.464, generator_adv_loss=2.09, generator_feat_match_loss=4.875, over 61.00 samples.], tot_loss[discriminator_loss=2.428, discriminator_real_loss=1.205, discriminator_fake_loss=1.224, generator_loss=30.67, generator_mel_loss=19.53, generator_kl_loss=1.947, generator_dur_loss=1.472, generator_adv_loss=2.436, generator_feat_match_loss=5.286, over 1870.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:02:01,479 INFO [train.py:845] (3/4) Start epoch 827 2024-02-24 10:05:36,446 INFO [train.py:845] (3/4) Start epoch 828 2024-02-24 10:05:56,262 INFO [train.py:471] (3/4) Epoch 828, batch 1, global_batch_idx: 30600, batch size: 64, loss[discriminator_loss=2.492, discriminator_real_loss=1.16, discriminator_fake_loss=1.333, generator_loss=30.61, generator_mel_loss=20.04, generator_kl_loss=1.964, generator_dur_loss=1.458, generator_adv_loss=2.297, generator_feat_match_loss=4.859, over 64.00 samples.], tot_loss[discriminator_loss=2.46, discriminator_real_loss=1.183, discriminator_fake_loss=1.278, generator_loss=30.66, generator_mel_loss=19.99, generator_kl_loss=1.945, generator_dur_loss=1.463, generator_adv_loss=2.316, generator_feat_match_loss=4.945, over 128.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:05:56,263 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 10:06:05,470 INFO [train.py:534] (3/4) Epoch 828, validation: discriminator_loss=2.618, discriminator_real_loss=1.27, discriminator_fake_loss=1.348, generator_loss=30.23, generator_mel_loss=20.33, generator_kl_loss=2.141, generator_dur_loss=1.474, generator_adv_loss=1.9, generator_feat_match_loss=4.378, over 100.00 samples. 
2024-02-24 10:06:05,471 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 10:09:12,727 INFO [train.py:845] (3/4) Start epoch 829 2024-02-24 10:10:44,241 INFO [train.py:471] (3/4) Epoch 829, batch 14, global_batch_idx: 30650, batch size: 69, loss[discriminator_loss=2.717, discriminator_real_loss=1.353, discriminator_fake_loss=1.364, generator_loss=29.61, generator_mel_loss=19.56, generator_kl_loss=1.898, generator_dur_loss=1.472, generator_adv_loss=2.162, generator_feat_match_loss=4.52, over 69.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.225, discriminator_fake_loss=1.211, generator_loss=30.54, generator_mel_loss=19.52, generator_kl_loss=1.919, generator_dur_loss=1.472, generator_adv_loss=2.432, generator_feat_match_loss=5.193, over 1150.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:12:43,117 INFO [train.py:845] (3/4) Start epoch 830 2024-02-24 10:15:20,159 INFO [train.py:471] (3/4) Epoch 830, batch 27, global_batch_idx: 30700, batch size: 101, loss[discriminator_loss=2.375, discriminator_real_loss=1.253, discriminator_fake_loss=1.122, generator_loss=30.93, generator_mel_loss=19.74, generator_kl_loss=2.119, generator_dur_loss=1.456, generator_adv_loss=2.383, generator_feat_match_loss=5.23, over 101.00 samples.], tot_loss[discriminator_loss=2.436, discriminator_real_loss=1.229, discriminator_fake_loss=1.207, generator_loss=30.6, generator_mel_loss=19.61, generator_kl_loss=1.966, generator_dur_loss=1.471, generator_adv_loss=2.409, generator_feat_match_loss=5.143, over 1990.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:16:13,007 INFO [train.py:845] (3/4) Start epoch 831 2024-02-24 10:19:46,445 INFO [train.py:845] (3/4) Start epoch 832 2024-02-24 10:20:17,848 INFO [train.py:471] (3/4) Epoch 832, batch 3, global_batch_idx: 30750, batch size: 52, loss[discriminator_loss=2.354, discriminator_real_loss=1.176, discriminator_fake_loss=1.178, 
generator_loss=31.37, generator_mel_loss=20.01, generator_kl_loss=1.922, generator_dur_loss=1.485, generator_adv_loss=2.484, generator_feat_match_loss=5.469, over 52.00 samples.], tot_loss[discriminator_loss=2.403, discriminator_real_loss=1.211, discriminator_fake_loss=1.191, generator_loss=30.88, generator_mel_loss=19.82, generator_kl_loss=1.982, generator_dur_loss=1.473, generator_adv_loss=2.425, generator_feat_match_loss=5.183, over 224.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 4.0 2024-02-24 10:23:11,672 INFO [train.py:845] (3/4) Start epoch 833 2024-02-24 10:24:47,367 INFO [train.py:471] (3/4) Epoch 833, batch 16, global_batch_idx: 30800, batch size: 52, loss[discriminator_loss=2.477, discriminator_real_loss=1.253, discriminator_fake_loss=1.224, generator_loss=30.6, generator_mel_loss=19.91, generator_kl_loss=2.018, generator_dur_loss=1.479, generator_adv_loss=2.371, generator_feat_match_loss=4.82, over 52.00 samples.], tot_loss[discriminator_loss=2.465, discriminator_real_loss=1.23, discriminator_fake_loss=1.235, generator_loss=30.26, generator_mel_loss=19.5, generator_kl_loss=1.963, generator_dur_loss=1.465, generator_adv_loss=2.346, generator_feat_match_loss=4.995, over 1301.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:24:47,369 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 10:24:56,355 INFO [train.py:534] (3/4) Epoch 833, validation: discriminator_loss=2.413, discriminator_real_loss=1.246, discriminator_fake_loss=1.168, generator_loss=31.07, generator_mel_loss=20.27, generator_kl_loss=2.041, generator_dur_loss=1.467, generator_adv_loss=2.25, generator_feat_match_loss=5.05, over 100.00 samples. 
2024-02-24 10:24:56,356 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 10:26:45,238 INFO [train.py:845] (3/4) Start epoch 834 2024-02-24 10:29:26,790 INFO [train.py:471] (3/4) Epoch 834, batch 29, global_batch_idx: 30850, batch size: 73, loss[discriminator_loss=2.387, discriminator_real_loss=1.156, discriminator_fake_loss=1.229, generator_loss=30.91, generator_mel_loss=19.52, generator_kl_loss=1.853, generator_dur_loss=1.478, generator_adv_loss=2.441, generator_feat_match_loss=5.617, over 73.00 samples.], tot_loss[discriminator_loss=2.449, discriminator_real_loss=1.241, discriminator_fake_loss=1.208, generator_loss=30.84, generator_mel_loss=19.83, generator_kl_loss=1.943, generator_dur_loss=1.473, generator_adv_loss=2.42, generator_feat_match_loss=5.169, over 2091.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 4.0 2024-02-24 10:30:06,143 INFO [train.py:845] (3/4) Start epoch 835 2024-02-24 10:33:33,596 INFO [train.py:845] (3/4) Start epoch 836 2024-02-24 10:34:07,435 INFO [train.py:471] (3/4) Epoch 836, batch 5, global_batch_idx: 30900, batch size: 53, loss[discriminator_loss=2.988, discriminator_real_loss=1.228, discriminator_fake_loss=1.762, generator_loss=28.59, generator_mel_loss=19.14, generator_kl_loss=1.919, generator_dur_loss=1.467, generator_adv_loss=1.868, generator_feat_match_loss=4.191, over 53.00 samples.], tot_loss[discriminator_loss=2.403, discriminator_real_loss=1.173, discriminator_fake_loss=1.23, generator_loss=30.42, generator_mel_loss=19.28, generator_kl_loss=1.994, generator_dur_loss=1.472, generator_adv_loss=2.401, generator_feat_match_loss=5.274, over 370.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:36:55,200 INFO [train.py:845] (3/4) Start epoch 837 2024-02-24 10:38:53,047 INFO [train.py:471] (3/4) Epoch 837, batch 18, global_batch_idx: 30950, batch size: 67, loss[discriminator_loss=2.43, discriminator_real_loss=1.284, discriminator_fake_loss=1.146, 
generator_loss=30.52, generator_mel_loss=19.25, generator_kl_loss=1.924, generator_dur_loss=1.486, generator_adv_loss=2.602, generator_feat_match_loss=5.258, over 67.00 samples.], tot_loss[discriminator_loss=2.42, discriminator_real_loss=1.217, discriminator_fake_loss=1.203, generator_loss=30.73, generator_mel_loss=19.44, generator_kl_loss=1.965, generator_dur_loss=1.466, generator_adv_loss=2.487, generator_feat_match_loss=5.375, over 1329.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:40:27,527 INFO [train.py:845] (3/4) Start epoch 838 2024-02-24 10:43:30,535 INFO [train.py:471] (3/4) Epoch 838, batch 31, global_batch_idx: 31000, batch size: 49, loss[discriminator_loss=2.238, discriminator_real_loss=1.136, discriminator_fake_loss=1.103, generator_loss=30.89, generator_mel_loss=19.15, generator_kl_loss=1.898, generator_dur_loss=1.48, generator_adv_loss=2.557, generator_feat_match_loss=5.797, over 49.00 samples.], tot_loss[discriminator_loss=2.376, discriminator_real_loss=1.203, discriminator_fake_loss=1.173, generator_loss=30.59, generator_mel_loss=19.32, generator_kl_loss=1.96, generator_dur_loss=1.47, generator_adv_loss=2.493, generator_feat_match_loss=5.346, over 2242.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:43:30,537 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 10:43:38,977 INFO [train.py:534] (3/4) Epoch 838, validation: discriminator_loss=2.224, discriminator_real_loss=1.015, discriminator_fake_loss=1.209, generator_loss=31.96, generator_mel_loss=19.92, generator_kl_loss=2.163, generator_dur_loss=1.475, generator_adv_loss=2.5, generator_feat_match_loss=5.904, over 100.00 samples. 
2024-02-24 10:43:38,978 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 10:44:05,849 INFO [train.py:845] (3/4) Start epoch 839 2024-02-24 10:47:33,457 INFO [train.py:845] (3/4) Start epoch 840 2024-02-24 10:48:31,698 INFO [train.py:471] (3/4) Epoch 840, batch 7, global_batch_idx: 31050, batch size: 110, loss[discriminator_loss=2.695, discriminator_real_loss=1.634, discriminator_fake_loss=1.061, generator_loss=31.49, generator_mel_loss=20.22, generator_kl_loss=1.98, generator_dur_loss=1.466, generator_adv_loss=2.492, generator_feat_match_loss=5.336, over 110.00 samples.], tot_loss[discriminator_loss=2.536, discriminator_real_loss=1.336, discriminator_fake_loss=1.2, generator_loss=30.81, generator_mel_loss=19.92, generator_kl_loss=1.959, generator_dur_loss=1.464, generator_adv_loss=2.404, generator_feat_match_loss=5.056, over 730.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 4.0 2024-02-24 10:51:04,838 INFO [train.py:845] (3/4) Start epoch 841 2024-02-24 10:52:54,821 INFO [train.py:471] (3/4) Epoch 841, batch 20, global_batch_idx: 31100, batch size: 76, loss[discriminator_loss=2.402, discriminator_real_loss=1.232, discriminator_fake_loss=1.17, generator_loss=30.43, generator_mel_loss=19.51, generator_kl_loss=1.983, generator_dur_loss=1.485, generator_adv_loss=2.508, generator_feat_match_loss=4.945, over 76.00 samples.], tot_loss[discriminator_loss=2.357, discriminator_real_loss=1.19, discriminator_fake_loss=1.167, generator_loss=30.96, generator_mel_loss=19.51, generator_kl_loss=1.94, generator_dur_loss=1.473, generator_adv_loss=2.511, generator_feat_match_loss=5.53, over 1492.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:54:24,716 INFO [train.py:845] (3/4) Start epoch 842 2024-02-24 10:57:26,623 INFO [train.py:471] (3/4) Epoch 842, batch 33, global_batch_idx: 31150, batch size: 69, loss[discriminator_loss=2.314, discriminator_real_loss=1.139, discriminator_fake_loss=1.176, 
generator_loss=30.31, generator_mel_loss=19.19, generator_kl_loss=1.873, generator_dur_loss=1.46, generator_adv_loss=2.434, generator_feat_match_loss=5.355, over 69.00 samples.], tot_loss[discriminator_loss=2.353, discriminator_real_loss=1.194, discriminator_fake_loss=1.159, generator_loss=31, generator_mel_loss=19.45, generator_kl_loss=1.963, generator_dur_loss=1.466, generator_adv_loss=2.534, generator_feat_match_loss=5.584, over 2308.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 10:57:46,023 INFO [train.py:845] (3/4) Start epoch 843 2024-02-24 11:01:13,555 INFO [train.py:845] (3/4) Start epoch 844 2024-02-24 11:02:10,130 INFO [train.py:471] (3/4) Epoch 844, batch 9, global_batch_idx: 31200, batch size: 65, loss[discriminator_loss=2.387, discriminator_real_loss=1.27, discriminator_fake_loss=1.118, generator_loss=31.22, generator_mel_loss=19.65, generator_kl_loss=2.04, generator_dur_loss=1.457, generator_adv_loss=2.492, generator_feat_match_loss=5.574, over 65.00 samples.], tot_loss[discriminator_loss=2.422, discriminator_real_loss=1.214, discriminator_fake_loss=1.209, generator_loss=30.72, generator_mel_loss=19.46, generator_kl_loss=1.945, generator_dur_loss=1.46, generator_adv_loss=2.474, generator_feat_match_loss=5.38, over 749.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2024-02-24 11:02:10,131 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 11:02:19,383 INFO [train.py:534] (3/4) Epoch 844, validation: discriminator_loss=2.289, discriminator_real_loss=1.157, discriminator_fake_loss=1.132, generator_loss=31.4, generator_mel_loss=19.84, generator_kl_loss=2.202, generator_dur_loss=1.473, generator_adv_loss=2.399, generator_feat_match_loss=5.487, over 100.00 samples. 
2024-02-24 11:02:19,384 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 11:04:49,191 INFO [train.py:845] (3/4) Start epoch 845 2024-02-24 11:07:01,216 INFO [train.py:471] (3/4) Epoch 845, batch 22, global_batch_idx: 31250, batch size: 53, loss[discriminator_loss=2.436, discriminator_real_loss=1.167, discriminator_fake_loss=1.269, generator_loss=30.13, generator_mel_loss=19.45, generator_kl_loss=1.958, generator_dur_loss=1.481, generator_adv_loss=2.457, generator_feat_match_loss=4.781, over 53.00 samples.], tot_loss[discriminator_loss=2.512, discriminator_real_loss=1.262, discriminator_fake_loss=1.25, generator_loss=30.52, generator_mel_loss=19.73, generator_kl_loss=1.978, generator_dur_loss=1.47, generator_adv_loss=2.344, generator_feat_match_loss=5.008, over 1654.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:08:19,878 INFO [train.py:845] (3/4) Start epoch 846 2024-02-24 11:11:44,524 INFO [train.py:471] (3/4) Epoch 846, batch 35, global_batch_idx: 31300, batch size: 59, loss[discriminator_loss=2.445, discriminator_real_loss=1.363, discriminator_fake_loss=1.082, generator_loss=30.63, generator_mel_loss=19.73, generator_kl_loss=1.99, generator_dur_loss=1.483, generator_adv_loss=2.387, generator_feat_match_loss=5.047, over 59.00 samples.], tot_loss[discriminator_loss=2.485, discriminator_real_loss=1.25, discriminator_fake_loss=1.236, generator_loss=30.46, generator_mel_loss=19.71, generator_kl_loss=1.963, generator_dur_loss=1.472, generator_adv_loss=2.329, generator_feat_match_loss=4.987, over 2546.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:11:49,361 INFO [train.py:845] (3/4) Start epoch 847 2024-02-24 11:15:23,696 INFO [train.py:845] (3/4) Start epoch 848 2024-02-24 11:16:31,433 INFO [train.py:471] (3/4) Epoch 848, batch 11, global_batch_idx: 31350, batch size: 64, loss[discriminator_loss=2.289, discriminator_real_loss=1.069, discriminator_fake_loss=1.219, 
generator_loss=32.38, generator_mel_loss=19.89, generator_kl_loss=2.022, generator_dur_loss=1.475, generator_adv_loss=2.732, generator_feat_match_loss=6.262, over 64.00 samples.], tot_loss[discriminator_loss=2.415, discriminator_real_loss=1.208, discriminator_fake_loss=1.207, generator_loss=30.86, generator_mel_loss=19.42, generator_kl_loss=1.973, generator_dur_loss=1.472, generator_adv_loss=2.54, generator_feat_match_loss=5.454, over 836.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:18:52,087 INFO [train.py:845] (3/4) Start epoch 849 2024-02-24 11:21:17,422 INFO [train.py:471] (3/4) Epoch 849, batch 24, global_batch_idx: 31400, batch size: 126, loss[discriminator_loss=2.238, discriminator_real_loss=1.168, discriminator_fake_loss=1.071, generator_loss=31.08, generator_mel_loss=19.2, generator_kl_loss=1.944, generator_dur_loss=1.458, generator_adv_loss=2.498, generator_feat_match_loss=5.98, over 126.00 samples.], tot_loss[discriminator_loss=2.457, discriminator_real_loss=1.262, discriminator_fake_loss=1.195, generator_loss=30.73, generator_mel_loss=19.53, generator_kl_loss=1.972, generator_dur_loss=1.463, generator_adv_loss=2.434, generator_feat_match_loss=5.329, over 2108.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:21:17,424 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 11:21:25,997 INFO [train.py:534] (3/4) Epoch 849, validation: discriminator_loss=2.216, discriminator_real_loss=1.036, discriminator_fake_loss=1.18, generator_loss=32.39, generator_mel_loss=20.15, generator_kl_loss=2.091, generator_dur_loss=1.472, generator_adv_loss=2.431, generator_feat_match_loss=6.246, over 100.00 samples. 
2024-02-24 11:21:25,998 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 11:22:32,184 INFO [train.py:845] (3/4) Start epoch 850 2024-02-24 11:25:57,267 INFO [train.py:845] (3/4) Start epoch 851 2024-02-24 11:26:10,197 INFO [train.py:471] (3/4) Epoch 851, batch 0, global_batch_idx: 31450, batch size: 65, loss[discriminator_loss=2.561, discriminator_real_loss=1.215, discriminator_fake_loss=1.346, generator_loss=30.26, generator_mel_loss=19.85, generator_kl_loss=1.917, generator_dur_loss=1.491, generator_adv_loss=2.137, generator_feat_match_loss=4.867, over 65.00 samples.], tot_loss[discriminator_loss=2.561, discriminator_real_loss=1.215, discriminator_fake_loss=1.346, generator_loss=30.26, generator_mel_loss=19.85, generator_kl_loss=1.917, generator_dur_loss=1.491, generator_adv_loss=2.137, generator_feat_match_loss=4.867, over 65.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:29:30,180 INFO [train.py:845] (3/4) Start epoch 852 2024-02-24 11:30:57,675 INFO [train.py:471] (3/4) Epoch 852, batch 13, global_batch_idx: 31500, batch size: 95, loss[discriminator_loss=2.828, discriminator_real_loss=1.193, discriminator_fake_loss=1.634, generator_loss=29.95, generator_mel_loss=19.5, generator_kl_loss=2.003, generator_dur_loss=1.466, generator_adv_loss=2.197, generator_feat_match_loss=4.789, over 95.00 samples.], tot_loss[discriminator_loss=2.374, discriminator_real_loss=1.189, discriminator_fake_loss=1.184, generator_loss=30.91, generator_mel_loss=19.42, generator_kl_loss=1.947, generator_dur_loss=1.464, generator_adv_loss=2.516, generator_feat_match_loss=5.559, over 1146.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:32:56,348 INFO [train.py:845] (3/4) Start epoch 853 2024-02-24 11:35:34,918 INFO [train.py:471] (3/4) Epoch 853, batch 26, global_batch_idx: 31550, batch size: 110, loss[discriminator_loss=2.465, discriminator_real_loss=1.369, discriminator_fake_loss=1.095, 
generator_loss=30.77, generator_mel_loss=19.56, generator_kl_loss=1.961, generator_dur_loss=1.455, generator_adv_loss=2.498, generator_feat_match_loss=5.289, over 110.00 samples.], tot_loss[discriminator_loss=2.38, discriminator_real_loss=1.19, discriminator_fake_loss=1.19, generator_loss=30.66, generator_mel_loss=19.33, generator_kl_loss=1.975, generator_dur_loss=1.467, generator_adv_loss=2.467, generator_feat_match_loss=5.425, over 2058.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:36:29,916 INFO [train.py:845] (3/4) Start epoch 854 2024-02-24 11:40:01,613 INFO [train.py:845] (3/4) Start epoch 855 2024-02-24 11:40:21,505 INFO [train.py:471] (3/4) Epoch 855, batch 2, global_batch_idx: 31600, batch size: 61, loss[discriminator_loss=2.258, discriminator_real_loss=1.178, discriminator_fake_loss=1.081, generator_loss=31.11, generator_mel_loss=19.41, generator_kl_loss=1.909, generator_dur_loss=1.467, generator_adv_loss=2.52, generator_feat_match_loss=5.801, over 61.00 samples.], tot_loss[discriminator_loss=2.293, discriminator_real_loss=1.169, discriminator_fake_loss=1.124, generator_loss=30.82, generator_mel_loss=19.22, generator_kl_loss=1.909, generator_dur_loss=1.469, generator_adv_loss=2.547, generator_feat_match_loss=5.673, over 195.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:40:21,506 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 11:40:30,300 INFO [train.py:534] (3/4) Epoch 855, validation: discriminator_loss=2.228, discriminator_real_loss=1.059, discriminator_fake_loss=1.169, generator_loss=31.7, generator_mel_loss=19.85, generator_kl_loss=2.182, generator_dur_loss=1.474, generator_adv_loss=2.45, generator_feat_match_loss=5.742, over 100.00 samples. 
2024-02-24 11:40:30,301 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 11:43:38,876 INFO [train.py:845] (3/4) Start epoch 856 2024-02-24 11:45:10,767 INFO [train.py:471] (3/4) Epoch 856, batch 15, global_batch_idx: 31650, batch size: 101, loss[discriminator_loss=2.463, discriminator_real_loss=1.081, discriminator_fake_loss=1.382, generator_loss=30.21, generator_mel_loss=19.39, generator_kl_loss=1.92, generator_dur_loss=1.492, generator_adv_loss=2.312, generator_feat_match_loss=5.094, over 101.00 samples.], tot_loss[discriminator_loss=2.376, discriminator_real_loss=1.169, discriminator_fake_loss=1.208, generator_loss=30.69, generator_mel_loss=19.4, generator_kl_loss=1.937, generator_dur_loss=1.471, generator_adv_loss=2.452, generator_feat_match_loss=5.42, over 1198.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:47:09,151 INFO [train.py:845] (3/4) Start epoch 857 2024-02-24 11:49:48,315 INFO [train.py:471] (3/4) Epoch 857, batch 28, global_batch_idx: 31700, batch size: 76, loss[discriminator_loss=2.535, discriminator_real_loss=1.245, discriminator_fake_loss=1.291, generator_loss=29.9, generator_mel_loss=19.44, generator_kl_loss=2.033, generator_dur_loss=1.48, generator_adv_loss=2.168, generator_feat_match_loss=4.773, over 76.00 samples.], tot_loss[discriminator_loss=2.519, discriminator_real_loss=1.28, discriminator_fake_loss=1.239, generator_loss=30.57, generator_mel_loss=19.93, generator_kl_loss=1.967, generator_dur_loss=1.473, generator_adv_loss=2.319, generator_feat_match_loss=4.877, over 1978.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:50:34,430 INFO [train.py:845] (3/4) Start epoch 858 2024-02-24 11:54:05,692 INFO [train.py:845] (3/4) Start epoch 859 2024-02-24 11:54:40,042 INFO [train.py:471] (3/4) Epoch 859, batch 4, global_batch_idx: 31750, batch size: 69, loss[discriminator_loss=2.207, discriminator_real_loss=1.081, discriminator_fake_loss=1.127, 
generator_loss=31.51, generator_mel_loss=19.43, generator_kl_loss=1.996, generator_dur_loss=1.47, generator_adv_loss=2.711, generator_feat_match_loss=5.898, over 69.00 samples.], tot_loss[discriminator_loss=2.29, discriminator_real_loss=1.149, discriminator_fake_loss=1.142, generator_loss=30.94, generator_mel_loss=19.27, generator_kl_loss=1.988, generator_dur_loss=1.464, generator_adv_loss=2.586, generator_feat_match_loss=5.632, over 367.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:57:39,272 INFO [train.py:845] (3/4) Start epoch 860 2024-02-24 11:59:25,725 INFO [train.py:471] (3/4) Epoch 860, batch 17, global_batch_idx: 31800, batch size: 52, loss[discriminator_loss=2.441, discriminator_real_loss=1.321, discriminator_fake_loss=1.12, generator_loss=31.15, generator_mel_loss=19.56, generator_kl_loss=2.043, generator_dur_loss=1.497, generator_adv_loss=2.697, generator_feat_match_loss=5.359, over 52.00 samples.], tot_loss[discriminator_loss=2.372, discriminator_real_loss=1.18, discriminator_fake_loss=1.192, generator_loss=30.64, generator_mel_loss=19.33, generator_kl_loss=1.945, generator_dur_loss=1.469, generator_adv_loss=2.485, generator_feat_match_loss=5.411, over 1299.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 11:59:25,727 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 11:59:34,164 INFO [train.py:534] (3/4) Epoch 860, validation: discriminator_loss=2.437, discriminator_real_loss=1.299, discriminator_fake_loss=1.137, generator_loss=31.8, generator_mel_loss=20.22, generator_kl_loss=2.105, generator_dur_loss=1.469, generator_adv_loss=2.549, generator_feat_match_loss=5.452, over 100.00 samples. 
2024-02-24 11:59:34,165 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB 2024-02-24 12:01:21,511 INFO [train.py:845] (3/4) Start epoch 861 2024-02-24 12:04:13,450 INFO [train.py:471] (3/4) Epoch 861, batch 30, global_batch_idx: 31850, batch size: 51, loss[discriminator_loss=2.484, discriminator_real_loss=1.288, discriminator_fake_loss=1.197, generator_loss=29.9, generator_mel_loss=19.26, generator_kl_loss=1.928, generator_dur_loss=1.501, generator_adv_loss=2.338, generator_feat_match_loss=4.879, over 51.00 samples.], tot_loss[discriminator_loss=2.414, discriminator_real_loss=1.213, discriminator_fake_loss=1.201, generator_loss=30.77, generator_mel_loss=19.61, generator_kl_loss=1.986, generator_dur_loss=1.466, generator_adv_loss=2.408, generator_feat_match_loss=5.306, over 2317.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 4.0 2024-02-24 12:04:52,919 INFO [train.py:845] (3/4) Start epoch 862 2024-02-24 12:08:20,374 INFO [train.py:845] (3/4) Start epoch 863 2024-02-24 12:09:04,561 INFO [train.py:471] (3/4) Epoch 863, batch 6, global_batch_idx: 31900, batch size: 50, loss[discriminator_loss=2.473, discriminator_real_loss=1.219, discriminator_fake_loss=1.253, generator_loss=30.53, generator_mel_loss=19.13, generator_kl_loss=1.999, generator_dur_loss=1.476, generator_adv_loss=2.549, generator_feat_match_loss=5.371, over 50.00 samples.], tot_loss[discriminator_loss=2.291, discriminator_real_loss=1.164, discriminator_fake_loss=1.127, generator_loss=31.2, generator_mel_loss=19.19, generator_kl_loss=1.951, generator_dur_loss=1.474, generator_adv_loss=2.686, generator_feat_match_loss=5.901, over 478.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 12:11:50,313 INFO [train.py:845] (3/4) Start epoch 864 2024-02-24 12:13:53,667 INFO [train.py:471] (3/4) Epoch 864, batch 19, global_batch_idx: 31950, batch size: 71, loss[discriminator_loss=2.555, discriminator_real_loss=1.359, discriminator_fake_loss=1.195, 
generator_loss=30.71, generator_mel_loss=19.72, generator_kl_loss=2.066, generator_dur_loss=1.458, generator_adv_loss=2.457, generator_feat_match_loss=5.012, over 71.00 samples.], tot_loss[discriminator_loss=2.487, discriminator_real_loss=1.26, discriminator_fake_loss=1.228, generator_loss=30.32, generator_mel_loss=19.43, generator_kl_loss=1.991, generator_dur_loss=1.46, generator_adv_loss=2.388, generator_feat_match_loss=5.057, over 1716.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 8.0 2024-02-24 12:15:20,400 INFO [train.py:845] (3/4) Start epoch 865 2024-02-24 12:18:28,327 INFO [train.py:471] (3/4) Epoch 865, batch 32, global_batch_idx: 32000, batch size: 126, loss[discriminator_loss=2.461, discriminator_real_loss=1.235, discriminator_fake_loss=1.226, generator_loss=30.48, generator_mel_loss=19.63, generator_kl_loss=2.002, generator_dur_loss=1.456, generator_adv_loss=2.492, generator_feat_match_loss=4.902, over 126.00 samples.], tot_loss[discriminator_loss=2.505, discriminator_real_loss=1.265, discriminator_fake_loss=1.24, generator_loss=30.47, generator_mel_loss=19.67, generator_kl_loss=1.976, generator_dur_loss=1.465, generator_adv_loss=2.354, generator_feat_match_loss=5.007, over 2647.00 samples.], cur_lr_g: 1.80e-04, cur_lr_d: 1.80e-04, grad_scale: 16.0 2024-02-24 12:18:28,329 INFO [train.py:525] (3/4) Computing validation loss 2024-02-24 12:18:37,269 INFO [train.py:534] (3/4) Epoch 865, validation: discriminator_loss=2.493, discriminator_real_loss=1.298, discriminator_fake_loss=1.196, generator_loss=31.81, generator_mel_loss=20.49, generator_kl_loss=2.266, generator_dur_loss=1.468, generator_adv_loss=2.409, generator_feat_match_loss=5.172, over 100.00 samples. 
2024-02-24 12:18:37,269 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 12:18:57,664 INFO [train.py:845] (3/4) Start epoch 866
2024-02-24 12:22:29,447 INFO [train.py:845] (3/4) Start epoch 867
2024-02-24 12:23:30,335 INFO [train.py:471] (3/4) Epoch 867, batch 8, global_batch_idx: 32050, batch size: 58, loss[discriminator_loss=2.406, discriminator_real_loss=1.295, discriminator_fake_loss=1.111, generator_loss=31.07, generator_mel_loss=19.74, generator_kl_loss=1.895, generator_dur_loss=1.468, generator_adv_loss=2.572, generator_feat_match_loss=5.391, over 58.00 samples.], tot_loss[discriminator_loss=2.409, discriminator_real_loss=1.247, discriminator_fake_loss=1.163, generator_loss=31.05, generator_mel_loss=19.7, generator_kl_loss=1.988, generator_dur_loss=1.462, generator_adv_loss=2.499, generator_feat_match_loss=5.406, over 771.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:26:03,707 INFO [train.py:845] (3/4) Start epoch 868
2024-02-24 12:28:04,731 INFO [train.py:471] (3/4) Epoch 868, batch 21, global_batch_idx: 32100, batch size: 56, loss[discriminator_loss=2.426, discriminator_real_loss=1.312, discriminator_fake_loss=1.113, generator_loss=30.7, generator_mel_loss=19.17, generator_kl_loss=1.967, generator_dur_loss=1.48, generator_adv_loss=2.539, generator_feat_match_loss=5.547, over 56.00 samples.], tot_loss[discriminator_loss=2.424, discriminator_real_loss=1.237, discriminator_fake_loss=1.187, generator_loss=30.41, generator_mel_loss=19.35, generator_kl_loss=1.935, generator_dur_loss=1.468, generator_adv_loss=2.432, generator_feat_match_loss=5.225, over 1510.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:29:31,132 INFO [train.py:845] (3/4) Start epoch 869
2024-02-24 12:32:52,024 INFO [train.py:471] (3/4) Epoch 869, batch 34, global_batch_idx: 32150, batch size: 53, loss[discriminator_loss=2.363, discriminator_real_loss=1.177, discriminator_fake_loss=1.187, generator_loss=30.85, generator_mel_loss=19.2, generator_kl_loss=2.026, generator_dur_loss=1.489, generator_adv_loss=2.58, generator_feat_match_loss=5.551, over 53.00 samples.], tot_loss[discriminator_loss=2.352, discriminator_real_loss=1.193, discriminator_fake_loss=1.16, generator_loss=30.89, generator_mel_loss=19.3, generator_kl_loss=1.973, generator_dur_loss=1.466, generator_adv_loss=2.536, generator_feat_match_loss=5.614, over 2450.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:33:01,409 INFO [train.py:845] (3/4) Start epoch 870
2024-02-24 12:36:31,482 INFO [train.py:845] (3/4) Start epoch 871
2024-02-24 12:37:36,275 INFO [train.py:471] (3/4) Epoch 871, batch 10, global_batch_idx: 32200, batch size: 53, loss[discriminator_loss=2.275, discriminator_real_loss=1.191, discriminator_fake_loss=1.084, generator_loss=31.13, generator_mel_loss=19.11, generator_kl_loss=1.867, generator_dur_loss=1.49, generator_adv_loss=2.666, generator_feat_match_loss=5.992, over 53.00 samples.], tot_loss[discriminator_loss=2.357, discriminator_real_loss=1.183, discriminator_fake_loss=1.174, generator_loss=30.91, generator_mel_loss=19.2, generator_kl_loss=1.99, generator_dur_loss=1.466, generator_adv_loss=2.567, generator_feat_match_loss=5.687, over 760.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:37:36,277 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 12:37:44,153 INFO [train.py:534] (3/4) Epoch 871, validation: discriminator_loss=2.271, discriminator_real_loss=1.128, discriminator_fake_loss=1.143, generator_loss=32.55, generator_mel_loss=20.28, generator_kl_loss=2.134, generator_dur_loss=1.47, generator_adv_loss=2.526, generator_feat_match_loss=6.131, over 100.00 samples.
2024-02-24 12:37:44,155 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 12:40:05,903 INFO [train.py:845] (3/4) Start epoch 872
2024-02-24 12:42:19,815 INFO [train.py:471] (3/4) Epoch 872, batch 23, global_batch_idx: 32250, batch size: 73, loss[discriminator_loss=2.375, discriminator_real_loss=1.188, discriminator_fake_loss=1.188, generator_loss=30.35, generator_mel_loss=19.46, generator_kl_loss=1.894, generator_dur_loss=1.463, generator_adv_loss=2.33, generator_feat_match_loss=5.203, over 73.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.202, discriminator_fake_loss=1.173, generator_loss=30.72, generator_mel_loss=19.29, generator_kl_loss=1.972, generator_dur_loss=1.462, generator_adv_loss=2.496, generator_feat_match_loss=5.501, over 1884.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:43:28,849 INFO [train.py:845] (3/4) Start epoch 873
2024-02-24 12:46:55,247 INFO [train.py:471] (3/4) Epoch 873, batch 36, global_batch_idx: 32300, batch size: 76, loss[discriminator_loss=2.52, discriminator_real_loss=1.256, discriminator_fake_loss=1.265, generator_loss=30.69, generator_mel_loss=19.28, generator_kl_loss=2.021, generator_dur_loss=1.466, generator_adv_loss=2.477, generator_feat_match_loss=5.445, over 76.00 samples.], tot_loss[discriminator_loss=2.397, discriminator_real_loss=1.216, discriminator_fake_loss=1.181, generator_loss=30.79, generator_mel_loss=19.44, generator_kl_loss=1.962, generator_dur_loss=1.465, generator_adv_loss=2.491, generator_feat_match_loss=5.433, over 2776.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:46:55,721 INFO [train.py:845] (3/4) Start epoch 874
2024-02-24 12:50:29,904 INFO [train.py:845] (3/4) Start epoch 875
2024-02-24 12:51:56,397 INFO [train.py:471] (3/4) Epoch 875, batch 12, global_batch_idx: 32350, batch size: 69, loss[discriminator_loss=2.379, discriminator_real_loss=1.136, discriminator_fake_loss=1.242, generator_loss=30.56, generator_mel_loss=19.28, generator_kl_loss=1.983, generator_dur_loss=1.473, generator_adv_loss=2.402, generator_feat_match_loss=5.426, over 69.00 samples.], tot_loss[discriminator_loss=2.311, discriminator_real_loss=1.162, discriminator_fake_loss=1.148, generator_loss=30.73, generator_mel_loss=19.1, generator_kl_loss=1.954, generator_dur_loss=1.461, generator_adv_loss=2.517, generator_feat_match_loss=5.697, over 1002.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 12:53:53,756 INFO [train.py:845] (3/4) Start epoch 876
2024-02-24 12:56:19,043 INFO [train.py:471] (3/4) Epoch 876, batch 25, global_batch_idx: 32400, batch size: 153, loss[discriminator_loss=2.293, discriminator_real_loss=1.182, discriminator_fake_loss=1.111, generator_loss=31.5, generator_mel_loss=19.22, generator_kl_loss=1.854, generator_dur_loss=1.459, generator_adv_loss=2.646, generator_feat_match_loss=6.32, over 153.00 samples.], tot_loss[discriminator_loss=2.369, discriminator_real_loss=1.206, discriminator_fake_loss=1.163, generator_loss=31.18, generator_mel_loss=19.41, generator_kl_loss=1.951, generator_dur_loss=1.464, generator_adv_loss=2.56, generator_feat_match_loss=5.798, over 2123.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0
2024-02-24 12:56:19,049 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 12:56:27,646 INFO [train.py:534] (3/4) Epoch 876, validation: discriminator_loss=2.212, discriminator_real_loss=1.075, discriminator_fake_loss=1.137, generator_loss=31.86, generator_mel_loss=19.68, generator_kl_loss=2.09, generator_dur_loss=1.467, generator_adv_loss=2.54, generator_feat_match_loss=6.082, over 100.00 samples.
2024-02-24 12:56:27,647 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 12:57:31,945 INFO [train.py:845] (3/4) Start epoch 877
2024-02-24 13:01:02,930 INFO [train.py:845] (3/4) Start epoch 878
2024-02-24 13:01:20,803 INFO [train.py:471] (3/4) Epoch 878, batch 1, global_batch_idx: 32450, batch size: 64, loss[discriminator_loss=2.258, discriminator_real_loss=1.188, discriminator_fake_loss=1.07, generator_loss=30.52, generator_mel_loss=18.8, generator_kl_loss=1.964, generator_dur_loss=1.495, generator_adv_loss=2.652, generator_feat_match_loss=5.609, over 64.00 samples.], tot_loss[discriminator_loss=2.276, discriminator_real_loss=1.196, discriminator_fake_loss=1.08, generator_loss=30.78, generator_mel_loss=19, generator_kl_loss=1.97, generator_dur_loss=1.488, generator_adv_loss=2.614, generator_feat_match_loss=5.711, over 131.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0
2024-02-24 13:04:35,559 INFO [train.py:845] (3/4) Start epoch 879
2024-02-24 13:05:59,352 INFO [train.py:471] (3/4) Epoch 879, batch 14, global_batch_idx: 32500, batch size: 64, loss[discriminator_loss=2.557, discriminator_real_loss=1.208, discriminator_fake_loss=1.349, generator_loss=30.49, generator_mel_loss=19.79, generator_kl_loss=2.069, generator_dur_loss=1.476, generator_adv_loss=2.266, generator_feat_match_loss=4.887, over 64.00 samples.], tot_loss[discriminator_loss=2.484, discriminator_real_loss=1.257, discriminator_fake_loss=1.227, generator_loss=30.47, generator_mel_loss=19.67, generator_kl_loss=1.975, generator_dur_loss=1.472, generator_adv_loss=2.349, generator_feat_match_loss=5.002, over 1015.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:08:02,260 INFO [train.py:845] (3/4) Start epoch 880
2024-02-24 13:10:44,712 INFO [train.py:471] (3/4) Epoch 880, batch 27, global_batch_idx: 32550, batch size: 52, loss[discriminator_loss=2.42, discriminator_real_loss=1.282, discriminator_fake_loss=1.138, generator_loss=29.32, generator_mel_loss=18.55, generator_kl_loss=1.849, generator_dur_loss=1.502, generator_adv_loss=2.457, generator_feat_match_loss=4.961, over 52.00 samples.], tot_loss[discriminator_loss=2.386, discriminator_real_loss=1.18, discriminator_fake_loss=1.206, generator_loss=30.67, generator_mel_loss=19.3, generator_kl_loss=1.965, generator_dur_loss=1.469, generator_adv_loss=2.475, generator_feat_match_loss=5.461, over 2032.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:11:31,053 INFO [train.py:845] (3/4) Start epoch 881
2024-02-24 13:14:49,968 INFO [train.py:845] (3/4) Start epoch 882
2024-02-24 13:15:16,955 INFO [train.py:471] (3/4) Epoch 882, batch 3, global_batch_idx: 32600, batch size: 76, loss[discriminator_loss=2.371, discriminator_real_loss=1.218, discriminator_fake_loss=1.154, generator_loss=30.05, generator_mel_loss=18.65, generator_kl_loss=2.008, generator_dur_loss=1.498, generator_adv_loss=2.385, generator_feat_match_loss=5.508, over 76.00 samples.], tot_loss[discriminator_loss=2.4, discriminator_real_loss=1.246, discriminator_fake_loss=1.154, generator_loss=30.38, generator_mel_loss=19.02, generator_kl_loss=1.96, generator_dur_loss=1.484, generator_adv_loss=2.501, generator_feat_match_loss=5.422, over 242.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:15:16,956 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 13:15:26,282 INFO [train.py:534] (3/4) Epoch 882, validation: discriminator_loss=2.327, discriminator_real_loss=0.9906, discriminator_fake_loss=1.336, generator_loss=31.93, generator_mel_loss=20.02, generator_kl_loss=2.166, generator_dur_loss=1.475, generator_adv_loss=2.37, generator_feat_match_loss=5.899, over 100.00 samples.
2024-02-24 13:15:26,282 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 13:18:30,415 INFO [train.py:845] (3/4) Start epoch 883
2024-02-24 13:20:03,095 INFO [train.py:471] (3/4) Epoch 883, batch 16, global_batch_idx: 32650, batch size: 65, loss[discriminator_loss=2.35, discriminator_real_loss=1.143, discriminator_fake_loss=1.207, generator_loss=30.54, generator_mel_loss=18.97, generator_kl_loss=2.028, generator_dur_loss=1.489, generator_adv_loss=2.531, generator_feat_match_loss=5.516, over 65.00 samples.], tot_loss[discriminator_loss=2.378, discriminator_real_loss=1.188, discriminator_fake_loss=1.191, generator_loss=30.49, generator_mel_loss=19.2, generator_kl_loss=1.961, generator_dur_loss=1.466, generator_adv_loss=2.467, generator_feat_match_loss=5.388, over 1087.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:21:50,906 INFO [train.py:845] (3/4) Start epoch 884
2024-02-24 13:24:36,071 INFO [train.py:471] (3/4) Epoch 884, batch 29, global_batch_idx: 32700, batch size: 67, loss[discriminator_loss=2.342, discriminator_real_loss=1.195, discriminator_fake_loss=1.146, generator_loss=30.44, generator_mel_loss=18.9, generator_kl_loss=1.946, generator_dur_loss=1.474, generator_adv_loss=2.551, generator_feat_match_loss=5.566, over 67.00 samples.], tot_loss[discriminator_loss=2.353, discriminator_real_loss=1.178, discriminator_fake_loss=1.175, generator_loss=30.88, generator_mel_loss=19.36, generator_kl_loss=1.961, generator_dur_loss=1.465, generator_adv_loss=2.505, generator_feat_match_loss=5.587, over 2275.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:25:21,459 INFO [train.py:845] (3/4) Start epoch 885
2024-02-24 13:28:52,810 INFO [train.py:845] (3/4) Start epoch 886
2024-02-24 13:29:34,534 INFO [train.py:471] (3/4) Epoch 886, batch 5, global_batch_idx: 32750, batch size: 61, loss[discriminator_loss=2.607, discriminator_real_loss=1.234, discriminator_fake_loss=1.373, generator_loss=29.88, generator_mel_loss=19.15, generator_kl_loss=1.957, generator_dur_loss=1.463, generator_adv_loss=2.219, generator_feat_match_loss=5.098, over 61.00 samples.], tot_loss[discriminator_loss=2.454, discriminator_real_loss=1.226, discriminator_fake_loss=1.228, generator_loss=30.33, generator_mel_loss=19.38, generator_kl_loss=1.981, generator_dur_loss=1.46, generator_adv_loss=2.397, generator_feat_match_loss=5.116, over 433.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:32:19,036 INFO [train.py:845] (3/4) Start epoch 887
2024-02-24 13:34:10,354 INFO [train.py:471] (3/4) Epoch 887, batch 18, global_batch_idx: 32800, batch size: 63, loss[discriminator_loss=2.23, discriminator_real_loss=1.104, discriminator_fake_loss=1.128, generator_loss=30.91, generator_mel_loss=18.9, generator_kl_loss=1.868, generator_dur_loss=1.426, generator_adv_loss=2.695, generator_feat_match_loss=6.02, over 63.00 samples.], tot_loss[discriminator_loss=2.342, discriminator_real_loss=1.186, discriminator_fake_loss=1.157, generator_loss=30.68, generator_mel_loss=19.12, generator_kl_loss=1.959, generator_dur_loss=1.463, generator_adv_loss=2.518, generator_feat_match_loss=5.622, over 1370.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0
2024-02-24 13:34:10,356 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 13:34:19,058 INFO [train.py:534] (3/4) Epoch 887, validation: discriminator_loss=2.258, discriminator_real_loss=1.123, discriminator_fake_loss=1.135, generator_loss=32.18, generator_mel_loss=20.02, generator_kl_loss=2.092, generator_dur_loss=1.473, generator_adv_loss=2.536, generator_feat_match_loss=6.053, over 100.00 samples.
2024-02-24 13:34:19,059 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 13:35:55,271 INFO [train.py:845] (3/4) Start epoch 888
2024-02-24 13:38:41,305 INFO [train.py:471] (3/4) Epoch 888, batch 31, global_batch_idx: 32850, batch size: 101, loss[discriminator_loss=2.266, discriminator_real_loss=1.326, discriminator_fake_loss=0.9404, generator_loss=31, generator_mel_loss=19.12, generator_kl_loss=1.968, generator_dur_loss=1.453, generator_adv_loss=2.406, generator_feat_match_loss=6.051, over 101.00 samples.], tot_loss[discriminator_loss=2.329, discriminator_real_loss=1.171, discriminator_fake_loss=1.158, generator_loss=30.73, generator_mel_loss=19.14, generator_kl_loss=1.963, generator_dur_loss=1.464, generator_adv_loss=2.525, generator_feat_match_loss=5.64, over 2327.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:39:16,411 INFO [train.py:845] (3/4) Start epoch 889
2024-02-24 13:42:44,419 INFO [train.py:845] (3/4) Start epoch 890
2024-02-24 13:43:36,185 INFO [train.py:471] (3/4) Epoch 890, batch 7, global_batch_idx: 32900, batch size: 153, loss[discriminator_loss=2.273, discriminator_real_loss=1.086, discriminator_fake_loss=1.187, generator_loss=30.9, generator_mel_loss=19.26, generator_kl_loss=1.928, generator_dur_loss=1.427, generator_adv_loss=2.559, generator_feat_match_loss=5.727, over 153.00 samples.], tot_loss[discriminator_loss=2.313, discriminator_real_loss=1.152, discriminator_fake_loss=1.16, generator_loss=30.51, generator_mel_loss=19.1, generator_kl_loss=1.951, generator_dur_loss=1.457, generator_adv_loss=2.501, generator_feat_match_loss=5.506, over 651.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:46:13,345 INFO [train.py:845] (3/4) Start epoch 891
2024-02-24 13:48:17,008 INFO [train.py:471] (3/4) Epoch 891, batch 20, global_batch_idx: 32950, batch size: 67, loss[discriminator_loss=2.299, discriminator_real_loss=1.172, discriminator_fake_loss=1.127, generator_loss=31.47, generator_mel_loss=19.74, generator_kl_loss=1.891, generator_dur_loss=1.46, generator_adv_loss=2.541, generator_feat_match_loss=5.84, over 67.00 samples.], tot_loss[discriminator_loss=2.345, discriminator_real_loss=1.189, discriminator_fake_loss=1.156, generator_loss=30.88, generator_mel_loss=19.29, generator_kl_loss=1.964, generator_dur_loss=1.466, generator_adv_loss=2.54, generator_feat_match_loss=5.617, over 1532.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:49:48,935 INFO [train.py:845] (3/4) Start epoch 892
2024-02-24 13:53:02,796 INFO [train.py:471] (3/4) Epoch 892, batch 33, global_batch_idx: 33000, batch size: 50, loss[discriminator_loss=2.32, discriminator_real_loss=1.128, discriminator_fake_loss=1.193, generator_loss=30.62, generator_mel_loss=19.05, generator_kl_loss=1.923, generator_dur_loss=1.469, generator_adv_loss=2.539, generator_feat_match_loss=5.641, over 50.00 samples.], tot_loss[discriminator_loss=2.352, discriminator_real_loss=1.191, discriminator_fake_loss=1.161, generator_loss=30.86, generator_mel_loss=19.25, generator_kl_loss=1.967, generator_dur_loss=1.463, generator_adv_loss=2.541, generator_feat_match_loss=5.647, over 2454.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 13:53:02,798 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 13:53:11,547 INFO [train.py:534] (3/4) Epoch 892, validation: discriminator_loss=2.249, discriminator_real_loss=1.092, discriminator_fake_loss=1.157, generator_loss=32.05, generator_mel_loss=19.98, generator_kl_loss=2.083, generator_dur_loss=1.472, generator_adv_loss=2.481, generator_feat_match_loss=6.025, over 100.00 samples.
2024-02-24 13:53:11,547 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 13:53:27,079 INFO [train.py:845] (3/4) Start epoch 893
2024-02-24 13:56:56,840 INFO [train.py:845] (3/4) Start epoch 894
2024-02-24 13:57:53,621 INFO [train.py:471] (3/4) Epoch 894, batch 9, global_batch_idx: 33050, batch size: 76, loss[discriminator_loss=2.279, discriminator_real_loss=1.348, discriminator_fake_loss=0.9316, generator_loss=30.91, generator_mel_loss=19.16, generator_kl_loss=1.963, generator_dur_loss=1.465, generator_adv_loss=2.633, generator_feat_match_loss=5.695, over 76.00 samples.], tot_loss[discriminator_loss=2.315, discriminator_real_loss=1.221, discriminator_fake_loss=1.094, generator_loss=30.99, generator_mel_loss=19.05, generator_kl_loss=1.935, generator_dur_loss=1.461, generator_adv_loss=2.635, generator_feat_match_loss=5.908, over 744.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:00:27,427 INFO [train.py:845] (3/4) Start epoch 895
2024-02-24 14:02:38,230 INFO [train.py:471] (3/4) Epoch 895, batch 22, global_batch_idx: 33100, batch size: 55, loss[discriminator_loss=2.512, discriminator_real_loss=1.192, discriminator_fake_loss=1.319, generator_loss=30.79, generator_mel_loss=19.74, generator_kl_loss=1.991, generator_dur_loss=1.519, generator_adv_loss=2.484, generator_feat_match_loss=5.051, over 55.00 samples.], tot_loss[discriminator_loss=2.428, discriminator_real_loss=1.228, discriminator_fake_loss=1.2, generator_loss=30.72, generator_mel_loss=19.42, generator_kl_loss=2.004, generator_dur_loss=1.469, generator_adv_loss=2.462, generator_feat_match_loss=5.363, over 1700.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:03:57,561 INFO [train.py:845] (3/4) Start epoch 896
2024-02-24 14:07:24,302 INFO [train.py:471] (3/4) Epoch 896, batch 35, global_batch_idx: 33150, batch size: 126, loss[discriminator_loss=2.32, discriminator_real_loss=1.12, discriminator_fake_loss=1.2, generator_loss=30.88, generator_mel_loss=19.2, generator_kl_loss=1.984, generator_dur_loss=1.447, generator_adv_loss=2.598, generator_feat_match_loss=5.648, over 126.00 samples.], tot_loss[discriminator_loss=2.34, discriminator_real_loss=1.175, discriminator_fake_loss=1.165, generator_loss=30.73, generator_mel_loss=19.19, generator_kl_loss=1.968, generator_dur_loss=1.465, generator_adv_loss=2.509, generator_feat_match_loss=5.595, over 2705.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:07:28,551 INFO [train.py:845] (3/4) Start epoch 897
2024-02-24 14:10:57,453 INFO [train.py:845] (3/4) Start epoch 898
2024-02-24 14:12:11,332 INFO [train.py:471] (3/4) Epoch 898, batch 11, global_batch_idx: 33200, batch size: 65, loss[discriminator_loss=2.416, discriminator_real_loss=1.221, discriminator_fake_loss=1.195, generator_loss=30.04, generator_mel_loss=18.71, generator_kl_loss=1.987, generator_dur_loss=1.468, generator_adv_loss=2.512, generator_feat_match_loss=5.363, over 65.00 samples.], tot_loss[discriminator_loss=2.405, discriminator_real_loss=1.228, discriminator_fake_loss=1.178, generator_loss=30.82, generator_mel_loss=19.24, generator_kl_loss=1.981, generator_dur_loss=1.472, generator_adv_loss=2.543, generator_feat_match_loss=5.586, over 774.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0
2024-02-24 14:12:11,333 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 14:12:20,143 INFO [train.py:534] (3/4) Epoch 898, validation: discriminator_loss=2.307, discriminator_real_loss=1.142, discriminator_fake_loss=1.165, generator_loss=31.75, generator_mel_loss=20.1, generator_kl_loss=2.145, generator_dur_loss=1.465, generator_adv_loss=2.441, generator_feat_match_loss=5.593, over 100.00 samples.
2024-02-24 14:12:20,144 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 14:14:33,019 INFO [train.py:845] (3/4) Start epoch 899
2024-02-24 14:17:00,806 INFO [train.py:471] (3/4) Epoch 899, batch 24, global_batch_idx: 33250, batch size: 58, loss[discriminator_loss=2.477, discriminator_real_loss=1.157, discriminator_fake_loss=1.318, generator_loss=30.12, generator_mel_loss=18.87, generator_kl_loss=2.018, generator_dur_loss=1.481, generator_adv_loss=2.33, generator_feat_match_loss=5.426, over 58.00 samples.], tot_loss[discriminator_loss=2.385, discriminator_real_loss=1.224, discriminator_fake_loss=1.161, generator_loss=30.9, generator_mel_loss=19.15, generator_kl_loss=1.954, generator_dur_loss=1.461, generator_adv_loss=2.574, generator_feat_match_loss=5.76, over 1968.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:18:05,959 INFO [train.py:845] (3/4) Start epoch 900
2024-02-24 14:21:33,297 INFO [train.py:845] (3/4) Start epoch 901
2024-02-24 14:21:45,399 INFO [train.py:471] (3/4) Epoch 901, batch 0, global_batch_idx: 33300, batch size: 71, loss[discriminator_loss=2.414, discriminator_real_loss=1.309, discriminator_fake_loss=1.106, generator_loss=29.93, generator_mel_loss=18.8, generator_kl_loss=1.947, generator_dur_loss=1.457, generator_adv_loss=2.51, generator_feat_match_loss=5.219, over 71.00 samples.], tot_loss[discriminator_loss=2.414, discriminator_real_loss=1.309, discriminator_fake_loss=1.106, generator_loss=29.93, generator_mel_loss=18.8, generator_kl_loss=1.947, generator_dur_loss=1.457, generator_adv_loss=2.51, generator_feat_match_loss=5.219, over 71.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:24:59,111 INFO [train.py:845] (3/4) Start epoch 902
2024-02-24 14:26:17,706 INFO [train.py:471] (3/4) Epoch 902, batch 13, global_batch_idx: 33350, batch size: 90, loss[discriminator_loss=2.746, discriminator_real_loss=1.373, discriminator_fake_loss=1.372, generator_loss=29.68, generator_mel_loss=19.68, generator_kl_loss=2.059, generator_dur_loss=1.471, generator_adv_loss=2.188, generator_feat_match_loss=4.285, over 90.00 samples.], tot_loss[discriminator_loss=2.531, discriminator_real_loss=1.281, discriminator_fake_loss=1.249, generator_loss=30.54, generator_mel_loss=19.81, generator_kl_loss=1.983, generator_dur_loss=1.47, generator_adv_loss=2.335, generator_feat_match_loss=4.947, over 1031.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:28:30,426 INFO [train.py:845] (3/4) Start epoch 903
2024-02-24 14:30:59,284 INFO [train.py:471] (3/4) Epoch 903, batch 26, global_batch_idx: 33400, batch size: 81, loss[discriminator_loss=2.379, discriminator_real_loss=1.223, discriminator_fake_loss=1.157, generator_loss=30.45, generator_mel_loss=19.3, generator_kl_loss=2.065, generator_dur_loss=1.464, generator_adv_loss=2.439, generator_feat_match_loss=5.188, over 81.00 samples.], tot_loss[discriminator_loss=2.434, discriminator_real_loss=1.24, discriminator_fake_loss=1.194, generator_loss=30.43, generator_mel_loss=19.13, generator_kl_loss=1.947, generator_dur_loss=1.464, generator_adv_loss=2.49, generator_feat_match_loss=5.403, over 2104.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:30:59,286 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 14:31:08,286 INFO [train.py:534] (3/4) Epoch 903, validation: discriminator_loss=2.417, discriminator_real_loss=1.134, discriminator_fake_loss=1.283, generator_loss=30.38, generator_mel_loss=19.67, generator_kl_loss=2.172, generator_dur_loss=1.468, generator_adv_loss=2.054, generator_feat_match_loss=5.017, over 100.00 samples.
2024-02-24 14:31:08,286 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 14:32:04,214 INFO [train.py:845] (3/4) Start epoch 904
2024-02-24 14:35:33,419 INFO [train.py:845] (3/4) Start epoch 905
2024-02-24 14:35:53,370 INFO [train.py:471] (3/4) Epoch 905, batch 2, global_batch_idx: 33450, batch size: 52, loss[discriminator_loss=2.398, discriminator_real_loss=1.16, discriminator_fake_loss=1.238, generator_loss=30.54, generator_mel_loss=19.1, generator_kl_loss=1.976, generator_dur_loss=1.473, generator_adv_loss=2.545, generator_feat_match_loss=5.453, over 52.00 samples.], tot_loss[discriminator_loss=2.294, discriminator_real_loss=1.126, discriminator_fake_loss=1.168, generator_loss=31.01, generator_mel_loss=19.28, generator_kl_loss=1.999, generator_dur_loss=1.477, generator_adv_loss=2.545, generator_feat_match_loss=5.705, over 170.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:38:58,780 INFO [train.py:845] (3/4) Start epoch 906
2024-02-24 14:40:35,458 INFO [train.py:471] (3/4) Epoch 906, batch 15, global_batch_idx: 33500, batch size: 59, loss[discriminator_loss=2.35, discriminator_real_loss=1.189, discriminator_fake_loss=1.16, generator_loss=30.08, generator_mel_loss=18.9, generator_kl_loss=1.969, generator_dur_loss=1.458, generator_adv_loss=2.369, generator_feat_match_loss=5.383, over 59.00 samples.], tot_loss[discriminator_loss=2.341, discriminator_real_loss=1.192, discriminator_fake_loss=1.15, generator_loss=30.94, generator_mel_loss=19.3, generator_kl_loss=2.001, generator_dur_loss=1.464, generator_adv_loss=2.518, generator_feat_match_loss=5.663, over 1227.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:42:25,913 INFO [train.py:845] (3/4) Start epoch 907
2024-02-24 14:45:17,460 INFO [train.py:471] (3/4) Epoch 907, batch 28, global_batch_idx: 33550, batch size: 55, loss[discriminator_loss=2.283, discriminator_real_loss=1.073, discriminator_fake_loss=1.21, generator_loss=31.09, generator_mel_loss=19.21, generator_kl_loss=2.016, generator_dur_loss=1.484, generator_adv_loss=2.486, generator_feat_match_loss=5.898, over 55.00 samples.], tot_loss[discriminator_loss=2.406, discriminator_real_loss=1.212, discriminator_fake_loss=1.194, generator_loss=30.75, generator_mel_loss=19.39, generator_kl_loss=1.975, generator_dur_loss=1.461, generator_adv_loss=2.454, generator_feat_match_loss=5.473, over 2296.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:45:55,212 INFO [train.py:845] (3/4) Start epoch 908
2024-02-24 14:49:21,586 INFO [train.py:845] (3/4) Start epoch 909
2024-02-24 14:49:55,814 INFO [train.py:471] (3/4) Epoch 909, batch 4, global_batch_idx: 33600, batch size: 61, loss[discriminator_loss=2.34, discriminator_real_loss=1.108, discriminator_fake_loss=1.232, generator_loss=30.33, generator_mel_loss=18.78, generator_kl_loss=1.892, generator_dur_loss=1.476, generator_adv_loss=2.576, generator_feat_match_loss=5.598, over 61.00 samples.], tot_loss[discriminator_loss=2.301, discriminator_real_loss=1.188, discriminator_fake_loss=1.113, generator_loss=30.82, generator_mel_loss=19.03, generator_kl_loss=1.986, generator_dur_loss=1.472, generator_adv_loss=2.564, generator_feat_match_loss=5.772, over 337.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 16.0
2024-02-24 14:49:55,815 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 14:50:04,839 INFO [train.py:534] (3/4) Epoch 909, validation: discriminator_loss=2.548, discriminator_real_loss=1.19, discriminator_fake_loss=1.359, generator_loss=31.49, generator_mel_loss=20.29, generator_kl_loss=2.146, generator_dur_loss=1.466, generator_adv_loss=2.153, generator_feat_match_loss=5.436, over 100.00 samples.
2024-02-24 14:50:04,840 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 14:53:01,649 INFO [train.py:845] (3/4) Start epoch 910
2024-02-24 14:54:49,491 INFO [train.py:471] (3/4) Epoch 910, batch 17, global_batch_idx: 33650, batch size: 90, loss[discriminator_loss=2.393, discriminator_real_loss=1.195, discriminator_fake_loss=1.197, generator_loss=30.42, generator_mel_loss=19.39, generator_kl_loss=1.928, generator_dur_loss=1.431, generator_adv_loss=2.436, generator_feat_match_loss=5.242, over 90.00 samples.], tot_loss[discriminator_loss=2.377, discriminator_real_loss=1.188, discriminator_fake_loss=1.189, generator_loss=30.57, generator_mel_loss=19.25, generator_kl_loss=1.98, generator_dur_loss=1.46, generator_adv_loss=2.435, generator_feat_match_loss=5.446, over 1426.00 samples.], cur_lr_g: 1.79e-04, cur_lr_d: 1.79e-04, grad_scale: 8.0
2024-02-24 14:56:31,441 INFO [train.py:845] (3/4) Start epoch 911
2024-02-24 14:59:31,437 INFO [train.py:471] (3/4) Epoch 911, batch 30, global_batch_idx: 33700, batch size: 90, loss[discriminator_loss=2.477, discriminator_real_loss=1.346, discriminator_fake_loss=1.13, generator_loss=30.94, generator_mel_loss=19.36, generator_kl_loss=1.92, generator_dur_loss=1.463, generator_adv_loss=2.6, generator_feat_match_loss=5.598, over 90.00 samples.], tot_loss[discriminator_loss=2.474, discriminator_real_loss=1.244, discriminator_fake_loss=1.23, generator_loss=30.63, generator_mel_loss=19.44, generator_kl_loss=1.965, generator_dur_loss=1.466, generator_adv_loss=2.458, generator_feat_match_loss=5.308, over 2142.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:00:05,282 INFO [train.py:845] (3/4) Start epoch 912
2024-02-24 15:03:29,650 INFO [train.py:845] (3/4) Start epoch 913
2024-02-24 15:04:14,028 INFO [train.py:471] (3/4) Epoch 913, batch 6, global_batch_idx: 33750, batch size: 52, loss[discriminator_loss=2.168, discriminator_real_loss=1.024, discriminator_fake_loss=1.144, generator_loss=30.6, generator_mel_loss=18.53, generator_kl_loss=1.931, generator_dur_loss=1.463, generator_adv_loss=2.605, generator_feat_match_loss=6.07, over 52.00 samples.], tot_loss[discriminator_loss=2.353, discriminator_real_loss=1.172, discriminator_fake_loss=1.181, generator_loss=30.46, generator_mel_loss=18.91, generator_kl_loss=1.973, generator_dur_loss=1.461, generator_adv_loss=2.498, generator_feat_match_loss=5.621, over 492.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:06:55,837 INFO [train.py:845] (3/4) Start epoch 914
2024-02-24 15:08:58,307 INFO [train.py:471] (3/4) Epoch 914, batch 19, global_batch_idx: 33800, batch size: 58, loss[discriminator_loss=2.4, discriminator_real_loss=1.229, discriminator_fake_loss=1.171, generator_loss=30.67, generator_mel_loss=19.53, generator_kl_loss=2.127, generator_dur_loss=1.47, generator_adv_loss=2.445, generator_feat_match_loss=5.098, over 58.00 samples.], tot_loss[discriminator_loss=2.383, discriminator_real_loss=1.191, discriminator_fake_loss=1.192, generator_loss=30.36, generator_mel_loss=19.17, generator_kl_loss=1.985, generator_dur_loss=1.455, generator_adv_loss=2.418, generator_feat_match_loss=5.335, over 1594.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:08:58,308 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 15:09:07,055 INFO [train.py:534] (3/4) Epoch 914, validation: discriminator_loss=2.447, discriminator_real_loss=1.237, discriminator_fake_loss=1.21, generator_loss=31.6, generator_mel_loss=20.09, generator_kl_loss=2.2, generator_dur_loss=1.469, generator_adv_loss=2.405, generator_feat_match_loss=5.437, over 100.00 samples.
2024-02-24 15:09:07,056 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 15:10:33,193 INFO [train.py:845] (3/4) Start epoch 915
2024-02-24 15:13:32,291 INFO [train.py:471] (3/4) Epoch 915, batch 32, global_batch_idx: 33850, batch size: 50, loss[discriminator_loss=2.312, discriminator_real_loss=1.128, discriminator_fake_loss=1.185, generator_loss=30.88, generator_mel_loss=19.16, generator_kl_loss=1.984, generator_dur_loss=1.47, generator_adv_loss=2.557, generator_feat_match_loss=5.707, over 50.00 samples.], tot_loss[discriminator_loss=2.4, discriminator_real_loss=1.208, discriminator_fake_loss=1.193, generator_loss=30.68, generator_mel_loss=19.4, generator_kl_loss=2.005, generator_dur_loss=1.466, generator_adv_loss=2.451, generator_feat_match_loss=5.358, over 2251.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:14:00,336 INFO [train.py:845] (3/4) Start epoch 916
2024-02-24 15:17:33,351 INFO [train.py:845] (3/4) Start epoch 917
2024-02-24 15:18:25,058 INFO [train.py:471] (3/4) Epoch 917, batch 8, global_batch_idx: 33900, batch size: 90, loss[discriminator_loss=2.367, discriminator_real_loss=1.164, discriminator_fake_loss=1.203, generator_loss=29.88, generator_mel_loss=18.77, generator_kl_loss=1.986, generator_dur_loss=1.449, generator_adv_loss=2.369, generator_feat_match_loss=5.301, over 90.00 samples.], tot_loss[discriminator_loss=2.334, discriminator_real_loss=1.141, discriminator_fake_loss=1.194, generator_loss=30.54, generator_mel_loss=19.03, generator_kl_loss=1.987, generator_dur_loss=1.469, generator_adv_loss=2.472, generator_feat_match_loss=5.577, over 607.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:20:58,426 INFO [train.py:845] (3/4) Start epoch 918
2024-02-24 15:23:06,481 INFO [train.py:471] (3/4) Epoch 918, batch 21, global_batch_idx: 33950, batch size: 153, loss[discriminator_loss=2.184, discriminator_real_loss=1.028, discriminator_fake_loss=1.155, generator_loss=31.62, generator_mel_loss=19.23, generator_kl_loss=1.951, generator_dur_loss=1.436, generator_adv_loss=2.586, generator_feat_match_loss=6.414, over 153.00 samples.], tot_loss[discriminator_loss=2.33, discriminator_real_loss=1.169, discriminator_fake_loss=1.16, generator_loss=30.83, generator_mel_loss=19.1, generator_kl_loss=1.98, generator_dur_loss=1.463, generator_adv_loss=2.541, generator_feat_match_loss=5.74, over 1721.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:24:28,340 INFO [train.py:845] (3/4) Start epoch 919
2024-02-24 15:27:44,036 INFO [train.py:471] (3/4) Epoch 919, batch 34, global_batch_idx: 34000, batch size: 126, loss[discriminator_loss=2.234, discriminator_real_loss=1.064, discriminator_fake_loss=1.17, generator_loss=31.12, generator_mel_loss=19.15, generator_kl_loss=1.919, generator_dur_loss=1.461, generator_adv_loss=2.531, generator_feat_match_loss=6.062, over 126.00 samples.], tot_loss[discriminator_loss=2.337, discriminator_real_loss=1.175, discriminator_fake_loss=1.162, generator_loss=30.54, generator_mel_loss=18.96, generator_kl_loss=1.956, generator_dur_loss=1.465, generator_adv_loss=2.524, generator_feat_match_loss=5.635, over 2656.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:27:44,038 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 15:27:53,620 INFO [train.py:534] (3/4) Epoch 919, validation: discriminator_loss=2.372, discriminator_real_loss=1.08, discriminator_fake_loss=1.292, generator_loss=31.4, generator_mel_loss=19.76, generator_kl_loss=2.195, generator_dur_loss=1.476, generator_adv_loss=2.246, generator_feat_match_loss=5.725, over 100.00 samples.
2024-02-24 15:27:53,620 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 15:28:04,602 INFO [train.py:845] (3/4) Start epoch 920
2024-02-24 15:31:31,447 INFO [train.py:845] (3/4) Start epoch 921
2024-02-24 15:32:38,476 INFO [train.py:471] (3/4) Epoch 921, batch 10, global_batch_idx: 34050, batch size: 76, loss[discriminator_loss=2.59, discriminator_real_loss=1.145, discriminator_fake_loss=1.444, generator_loss=30.23, generator_mel_loss=19.13, generator_kl_loss=1.92, generator_dur_loss=1.455, generator_adv_loss=2.396, generator_feat_match_loss=5.324, over 76.00 samples.], tot_loss[discriminator_loss=2.342, discriminator_real_loss=1.212, discriminator_fake_loss=1.13, generator_loss=31.05, generator_mel_loss=19.12, generator_kl_loss=1.97, generator_dur_loss=1.462, generator_adv_loss=2.589, generator_feat_match_loss=5.912, over 869.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:34:59,419 INFO [train.py:845] (3/4) Start epoch 922
2024-02-24 15:37:14,138 INFO [train.py:471] (3/4) Epoch 922, batch 23, global_batch_idx: 34100, batch size: 61, loss[discriminator_loss=2.268, discriminator_real_loss=1.197, discriminator_fake_loss=1.07, generator_loss=31.39, generator_mel_loss=19.44, generator_kl_loss=2.099, generator_dur_loss=1.476, generator_adv_loss=2.572, generator_feat_match_loss=5.797, over 61.00 samples.], tot_loss[discriminator_loss=2.338, discriminator_real_loss=1.18, discriminator_fake_loss=1.158, generator_loss=30.7, generator_mel_loss=19.08, generator_kl_loss=1.979, generator_dur_loss=1.465, generator_adv_loss=2.517, generator_feat_match_loss=5.655, over 1754.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:38:23,824 INFO [train.py:845] (3/4) Start epoch 923
2024-02-24 15:41:50,063 INFO [train.py:471] (3/4) Epoch 923, batch 36, global_batch_idx: 34150, batch size: 71, loss[discriminator_loss=2.336, discriminator_real_loss=1.173, discriminator_fake_loss=1.164,
2024-02-24 15:27:53,620 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 15:28:04,602 INFO [train.py:845] (3/4) Start epoch 920
2024-02-24 15:31:31,447 INFO [train.py:845] (3/4) Start epoch 921
2024-02-24 15:32:38,476 INFO [train.py:471] (3/4) Epoch 921, batch 10, global_batch_idx: 34050, batch size: 76, loss[discriminator_loss=2.59, discriminator_real_loss=1.145, discriminator_fake_loss=1.444, generator_loss=30.23, generator_mel_loss=19.13, generator_kl_loss=1.92, generator_dur_loss=1.455, generator_adv_loss=2.396, generator_feat_match_loss=5.324, over 76.00 samples.], tot_loss[discriminator_loss=2.342, discriminator_real_loss=1.212, discriminator_fake_loss=1.13, generator_loss=31.05, generator_mel_loss=19.12, generator_kl_loss=1.97, generator_dur_loss=1.462, generator_adv_loss=2.589, generator_feat_match_loss=5.912, over 869.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:34:59,419 INFO [train.py:845] (3/4) Start epoch 922
2024-02-24 15:37:14,138 INFO [train.py:471] (3/4) Epoch 922, batch 23, global_batch_idx: 34100, batch size: 61, loss[discriminator_loss=2.268, discriminator_real_loss=1.197, discriminator_fake_loss=1.07, generator_loss=31.39, generator_mel_loss=19.44, generator_kl_loss=2.099, generator_dur_loss=1.476, generator_adv_loss=2.572, generator_feat_match_loss=5.797, over 61.00 samples.], tot_loss[discriminator_loss=2.338, discriminator_real_loss=1.18, discriminator_fake_loss=1.158, generator_loss=30.7, generator_mel_loss=19.08, generator_kl_loss=1.979, generator_dur_loss=1.465, generator_adv_loss=2.517, generator_feat_match_loss=5.655, over 1754.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:38:23,824 INFO [train.py:845] (3/4) Start epoch 923
2024-02-24 15:41:50,063 INFO [train.py:471] (3/4) Epoch 923, batch 36, global_batch_idx: 34150, batch size: 71, loss[discriminator_loss=2.336, discriminator_real_loss=1.173, discriminator_fake_loss=1.164, generator_loss=30.53, generator_mel_loss=18.88, generator_kl_loss=2.071, generator_dur_loss=1.471, generator_adv_loss=2.516, generator_feat_match_loss=5.594, over 71.00 samples.], tot_loss[discriminator_loss=2.358, discriminator_real_loss=1.198, discriminator_fake_loss=1.16, generator_loss=30.72, generator_mel_loss=19.06, generator_kl_loss=1.973, generator_dur_loss=1.459, generator_adv_loss=2.544, generator_feat_match_loss=5.681, over 2892.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 15:41:50,534 INFO [train.py:845] (3/4) Start epoch 924
2024-02-24 15:45:18,067 INFO [train.py:845] (3/4) Start epoch 925
2024-02-24 15:46:35,566 INFO [train.py:471] (3/4) Epoch 925, batch 12, global_batch_idx: 34200, batch size: 50, loss[discriminator_loss=2.34, discriminator_real_loss=1.188, discriminator_fake_loss=1.153, generator_loss=30.08, generator_mel_loss=18.77, generator_kl_loss=1.97, generator_dur_loss=1.458, generator_adv_loss=2.531, generator_feat_match_loss=5.348, over 50.00 samples.], tot_loss[discriminator_loss=2.357, discriminator_real_loss=1.201, discriminator_fake_loss=1.156, generator_loss=30.72, generator_mel_loss=19.24, generator_kl_loss=1.938, generator_dur_loss=1.466, generator_adv_loss=2.52, generator_feat_match_loss=5.562, over 889.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:46:35,568 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 15:46:44,219 INFO [train.py:534] (3/4) Epoch 925, validation: discriminator_loss=2.279, discriminator_real_loss=1.121, discriminator_fake_loss=1.158, generator_loss=31.41, generator_mel_loss=19.57, generator_kl_loss=2.116, generator_dur_loss=1.468, generator_adv_loss=2.477, generator_feat_match_loss=5.784, over 100.00 samples.
2024-02-24 15:46:44,220 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 15:48:52,500 INFO [train.py:845] (3/4) Start epoch 926
2024-02-24 15:51:22,619 INFO [train.py:471] (3/4) Epoch 926, batch 25, global_batch_idx: 34250, batch size: 153, loss[discriminator_loss=2.508, discriminator_real_loss=1.272, discriminator_fake_loss=1.234, generator_loss=31.05, generator_mel_loss=19.58, generator_kl_loss=2.02, generator_dur_loss=1.425, generator_adv_loss=2.432, generator_feat_match_loss=5.598, over 153.00 samples.], tot_loss[discriminator_loss=2.51, discriminator_real_loss=1.268, discriminator_fake_loss=1.241, generator_loss=30.71, generator_mel_loss=19.69, generator_kl_loss=1.983, generator_dur_loss=1.462, generator_adv_loss=2.39, generator_feat_match_loss=5.179, over 2041.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:52:23,000 INFO [train.py:845] (3/4) Start epoch 927
2024-02-24 15:55:51,770 INFO [train.py:845] (3/4) Start epoch 928
2024-02-24 15:56:11,524 INFO [train.py:471] (3/4) Epoch 928, batch 1, global_batch_idx: 34300, batch size: 76, loss[discriminator_loss=2.248, discriminator_real_loss=1.078, discriminator_fake_loss=1.17, generator_loss=31.34, generator_mel_loss=19.3, generator_kl_loss=1.968, generator_dur_loss=1.488, generator_adv_loss=2.664, generator_feat_match_loss=5.914, over 76.00 samples.], tot_loss[discriminator_loss=2.248, discriminator_real_loss=1.111, discriminator_fake_loss=1.137, generator_loss=31.49, generator_mel_loss=19.52, generator_kl_loss=1.959, generator_dur_loss=1.469, generator_adv_loss=2.559, generator_feat_match_loss=5.981, over 186.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 15:59:19,096 INFO [train.py:845] (3/4) Start epoch 929
2024-02-24 16:00:46,610 INFO [train.py:471] (3/4) Epoch 929, batch 14, global_batch_idx: 34350, batch size: 73, loss[discriminator_loss=2.438, discriminator_real_loss=1.217, discriminator_fake_loss=1.221, generator_loss=31.63, generator_mel_loss=19.84, generator_kl_loss=2.029, generator_dur_loss=1.458, generator_adv_loss=2.535, generator_feat_match_loss=5.766, over 73.00 samples.], tot_loss[discriminator_loss=2.476, discriminator_real_loss=1.247, discriminator_fake_loss=1.228, generator_loss=30.71, generator_mel_loss=19.78, generator_kl_loss=2.004, generator_dur_loss=1.464, generator_adv_loss=2.355, generator_feat_match_loss=5.108, over 996.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 4.0
2024-02-24 16:02:48,830 INFO [train.py:845] (3/4) Start epoch 930
2024-02-24 16:05:25,082 INFO [train.py:471] (3/4) Epoch 930, batch 27, global_batch_idx: 34400, batch size: 60, loss[discriminator_loss=2.303, discriminator_real_loss=1.207, discriminator_fake_loss=1.096, generator_loss=30.98, generator_mel_loss=19.15, generator_kl_loss=1.947, generator_dur_loss=1.491, generator_adv_loss=2.539, generator_feat_match_loss=5.855, over 60.00 samples.], tot_loss[discriminator_loss=2.363, discriminator_real_loss=1.189, discriminator_fake_loss=1.173, generator_loss=30.91, generator_mel_loss=19.16, generator_kl_loss=1.962, generator_dur_loss=1.462, generator_adv_loss=2.549, generator_feat_match_loss=5.779, over 2049.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:05:25,083 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 16:05:33,348 INFO [train.py:534] (3/4) Epoch 930, validation: discriminator_loss=2.232, discriminator_real_loss=1.029, discriminator_fake_loss=1.203, generator_loss=31.67, generator_mel_loss=19.64, generator_kl_loss=2.172, generator_dur_loss=1.464, generator_adv_loss=2.472, generator_feat_match_loss=5.926, over 100.00 samples.
2024-02-24 16:05:33,349 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 16:06:25,553 INFO [train.py:845] (3/4) Start epoch 931
2024-02-24 16:09:49,732 INFO [train.py:845] (3/4) Start epoch 932
2024-02-24 16:10:14,897 INFO [train.py:471] (3/4) Epoch 932, batch 3, global_batch_idx: 34450, batch size: 73, loss[discriminator_loss=2.373, discriminator_real_loss=1.268, discriminator_fake_loss=1.105, generator_loss=30.15, generator_mel_loss=18.79, generator_kl_loss=1.899, generator_dur_loss=1.466, generator_adv_loss=2.484, generator_feat_match_loss=5.516, over 73.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.249, discriminator_fake_loss=1.179, generator_loss=30.07, generator_mel_loss=18.92, generator_kl_loss=1.949, generator_dur_loss=1.47, generator_adv_loss=2.466, generator_feat_match_loss=5.264, over 251.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:13:08,839 INFO [train.py:845] (3/4) Start epoch 933
2024-02-24 16:14:54,121 INFO [train.py:471] (3/4) Epoch 933, batch 16, global_batch_idx: 34500, batch size: 51, loss[discriminator_loss=2.234, discriminator_real_loss=1.039, discriminator_fake_loss=1.194, generator_loss=32.17, generator_mel_loss=19.91, generator_kl_loss=1.981, generator_dur_loss=1.492, generator_adv_loss=2.6, generator_feat_match_loss=6.18, over 51.00 samples.], tot_loss[discriminator_loss=2.33, discriminator_real_loss=1.177, discriminator_fake_loss=1.152, generator_loss=30.79, generator_mel_loss=19.19, generator_kl_loss=1.972, generator_dur_loss=1.46, generator_adv_loss=2.52, generator_feat_match_loss=5.643, over 1254.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:16:39,815 INFO [train.py:845] (3/4) Start epoch 934
2024-02-24 16:19:27,705 INFO [train.py:471] (3/4) Epoch 934, batch 29, global_batch_idx: 34550, batch size: 153, loss[discriminator_loss=2.418, discriminator_real_loss=1.26, discriminator_fake_loss=1.159, generator_loss=30.81, generator_mel_loss=19.31, generator_kl_loss=1.981, generator_dur_loss=1.44, generator_adv_loss=2.393, generator_feat_match_loss=5.688, over 153.00 samples.], tot_loss[discriminator_loss=2.366, discriminator_real_loss=1.192, discriminator_fake_loss=1.174, generator_loss=30.8, generator_mel_loss=19.12, generator_kl_loss=1.976, generator_dur_loss=1.463, generator_adv_loss=2.54, generator_feat_match_loss=5.7, over 2200.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:20:06,552 INFO [train.py:845] (3/4) Start epoch 935
2024-02-24 16:23:31,243 INFO [train.py:845] (3/4) Start epoch 936
2024-02-24 16:24:07,276 INFO [train.py:471] (3/4) Epoch 936, batch 5, global_batch_idx: 34600, batch size: 53, loss[discriminator_loss=2.377, discriminator_real_loss=1.183, discriminator_fake_loss=1.194, generator_loss=30.27, generator_mel_loss=19.03, generator_kl_loss=2.006, generator_dur_loss=1.475, generator_adv_loss=2.502, generator_feat_match_loss=5.254, over 53.00 samples.], tot_loss[discriminator_loss=2.371, discriminator_real_loss=1.188, discriminator_fake_loss=1.182, generator_loss=30.53, generator_mel_loss=19.31, generator_kl_loss=1.962, generator_dur_loss=1.475, generator_adv_loss=2.42, generator_feat_match_loss=5.361, over 382.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:24:07,279 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 16:24:15,900 INFO [train.py:534] (3/4) Epoch 936, validation: discriminator_loss=2.492, discriminator_real_loss=1.208, discriminator_fake_loss=1.284, generator_loss=30.81, generator_mel_loss=19.98, generator_kl_loss=2.176, generator_dur_loss=1.474, generator_adv_loss=2.089, generator_feat_match_loss=5.091, over 100.00 samples.
2024-02-24 16:24:15,900 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 16:27:06,526 INFO [train.py:845] (3/4) Start epoch 937
2024-02-24 16:28:47,140 INFO [train.py:471] (3/4) Epoch 937, batch 18, global_batch_idx: 34650, batch size: 54, loss[discriminator_loss=2.66, discriminator_real_loss=1.266, discriminator_fake_loss=1.395, generator_loss=30.98, generator_mel_loss=19.44, generator_kl_loss=1.932, generator_dur_loss=1.467, generator_adv_loss=2.867, generator_feat_match_loss=5.273, over 54.00 samples.], tot_loss[discriminator_loss=2.336, discriminator_real_loss=1.148, discriminator_fake_loss=1.188, generator_loss=30.9, generator_mel_loss=19.19, generator_kl_loss=1.955, generator_dur_loss=1.471, generator_adv_loss=2.554, generator_feat_match_loss=5.731, over 1222.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:30:36,673 INFO [train.py:845] (3/4) Start epoch 938
2024-02-24 16:33:39,039 INFO [train.py:471] (3/4) Epoch 938, batch 31, global_batch_idx: 34700, batch size: 69, loss[discriminator_loss=2.391, discriminator_real_loss=1.324, discriminator_fake_loss=1.066, generator_loss=30.66, generator_mel_loss=19.1, generator_kl_loss=2.008, generator_dur_loss=1.451, generator_adv_loss=2.562, generator_feat_match_loss=5.535, over 69.00 samples.], tot_loss[discriminator_loss=2.347, discriminator_real_loss=1.181, discriminator_fake_loss=1.165, generator_loss=30.67, generator_mel_loss=19.03, generator_kl_loss=1.97, generator_dur_loss=1.462, generator_adv_loss=2.525, generator_feat_match_loss=5.673, over 2326.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:34:05,009 INFO [train.py:845] (3/4) Start epoch 939
2024-02-24 16:37:34,002 INFO [train.py:845] (3/4) Start epoch 940
2024-02-24 16:38:27,738 INFO [train.py:471] (3/4) Epoch 940, batch 7, global_batch_idx: 34750, batch size: 56, loss[discriminator_loss=2.391, discriminator_real_loss=1.264, discriminator_fake_loss=1.126, generator_loss=30.37, generator_mel_loss=19.23, generator_kl_loss=1.99, generator_dur_loss=1.482, generator_adv_loss=2.439, generator_feat_match_loss=5.23, over 56.00 samples.], tot_loss[discriminator_loss=2.452, discriminator_real_loss=1.258, discriminator_fake_loss=1.193, generator_loss=30.68, generator_mel_loss=19.51, generator_kl_loss=2.051, generator_dur_loss=1.464, generator_adv_loss=2.411, generator_feat_match_loss=5.248, over 613.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:41:03,756 INFO [train.py:845] (3/4) Start epoch 941
2024-02-24 16:43:04,330 INFO [train.py:471] (3/4) Epoch 941, batch 20, global_batch_idx: 34800, batch size: 65, loss[discriminator_loss=2.391, discriminator_real_loss=1.115, discriminator_fake_loss=1.275, generator_loss=30.51, generator_mel_loss=18.85, generator_kl_loss=1.948, generator_dur_loss=1.439, generator_adv_loss=2.586, generator_feat_match_loss=5.684, over 65.00 samples.], tot_loss[discriminator_loss=2.487, discriminator_real_loss=1.277, discriminator_fake_loss=1.21, generator_loss=30.54, generator_mel_loss=19.38, generator_kl_loss=1.985, generator_dur_loss=1.463, generator_adv_loss=2.421, generator_feat_match_loss=5.292, over 1566.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 16:43:04,332 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 16:43:12,551 INFO [train.py:534] (3/4) Epoch 941, validation: discriminator_loss=2.856, discriminator_real_loss=1.061, discriminator_fake_loss=1.795, generator_loss=29.88, generator_mel_loss=19.8, generator_kl_loss=2.161, generator_dur_loss=1.477, generator_adv_loss=1.675, generator_feat_match_loss=4.763, over 100.00 samples.
2024-02-24 16:43:12,552 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 16:44:41,642 INFO [train.py:845] (3/4) Start epoch 942
2024-02-24 16:47:57,674 INFO [train.py:471] (3/4) Epoch 942, batch 33, global_batch_idx: 34850, batch size: 79, loss[discriminator_loss=2.484, discriminator_real_loss=1.311, discriminator_fake_loss=1.173, generator_loss=30.74, generator_mel_loss=19.59, generator_kl_loss=1.929, generator_dur_loss=1.49, generator_adv_loss=2.43, generator_feat_match_loss=5.297, over 79.00 samples.], tot_loss[discriminator_loss=2.395, discriminator_real_loss=1.212, discriminator_fake_loss=1.183, generator_loss=30.61, generator_mel_loss=19.22, generator_kl_loss=1.978, generator_dur_loss=1.459, generator_adv_loss=2.467, generator_feat_match_loss=5.481, over 2764.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:48:11,129 INFO [train.py:845] (3/4) Start epoch 943
2024-02-24 16:51:34,493 INFO [train.py:845] (3/4) Start epoch 944
2024-02-24 16:52:31,093 INFO [train.py:471] (3/4) Epoch 944, batch 9, global_batch_idx: 34900, batch size: 81, loss[discriminator_loss=2.301, discriminator_real_loss=1.13, discriminator_fake_loss=1.171, generator_loss=31.07, generator_mel_loss=19.23, generator_kl_loss=1.978, generator_dur_loss=1.467, generator_adv_loss=2.594, generator_feat_match_loss=5.797, over 81.00 samples.], tot_loss[discriminator_loss=2.348, discriminator_real_loss=1.181, discriminator_fake_loss=1.168, generator_loss=30.72, generator_mel_loss=19.14, generator_kl_loss=1.963, generator_dur_loss=1.469, generator_adv_loss=2.516, generator_feat_match_loss=5.636, over 704.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:55:04,604 INFO [train.py:845] (3/4) Start epoch 945
2024-02-24 16:57:13,548 INFO [train.py:471] (3/4) Epoch 945, batch 22, global_batch_idx: 34950, batch size: 51, loss[discriminator_loss=2.518, discriminator_real_loss=1.496, discriminator_fake_loss=1.021, generator_loss=30.33, generator_mel_loss=19, generator_kl_loss=2.005, generator_dur_loss=1.443, generator_adv_loss=2.594, generator_feat_match_loss=5.293, over 51.00 samples.], tot_loss[discriminator_loss=2.331, discriminator_real_loss=1.177, discriminator_fake_loss=1.153, generator_loss=30.94, generator_mel_loss=19.07, generator_kl_loss=1.965, generator_dur_loss=1.473, generator_adv_loss=2.58, generator_feat_match_loss=5.853, over 1604.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 16:58:30,240 INFO [train.py:845] (3/4) Start epoch 946
2024-02-24 17:01:55,576 INFO [train.py:471] (3/4) Epoch 946, batch 35, global_batch_idx: 35000, batch size: 126, loss[discriminator_loss=2.484, discriminator_real_loss=1.224, discriminator_fake_loss=1.262, generator_loss=31.03, generator_mel_loss=19.97, generator_kl_loss=1.969, generator_dur_loss=1.433, generator_adv_loss=2.375, generator_feat_match_loss=5.277, over 126.00 samples.], tot_loss[discriminator_loss=2.396, discriminator_real_loss=1.201, discriminator_fake_loss=1.195, generator_loss=30.64, generator_mel_loss=19.26, generator_kl_loss=1.967, generator_dur_loss=1.459, generator_adv_loss=2.458, generator_feat_match_loss=5.494, over 2798.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:01:55,578 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 17:02:04,506 INFO [train.py:534] (3/4) Epoch 946, validation: discriminator_loss=2.422, discriminator_real_loss=1.256, discriminator_fake_loss=1.166, generator_loss=31.43, generator_mel_loss=20.24, generator_kl_loss=2.194, generator_dur_loss=1.467, generator_adv_loss=2.341, generator_feat_match_loss=5.192, over 100.00 samples.
2024-02-24 17:02:04,507 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 17:02:10,819 INFO [train.py:845] (3/4) Start epoch 947
2024-02-24 17:05:37,444 INFO [train.py:845] (3/4) Start epoch 948
2024-02-24 17:06:48,274 INFO [train.py:471] (3/4) Epoch 948, batch 11, global_batch_idx: 35050, batch size: 79, loss[discriminator_loss=2.215, discriminator_real_loss=1.254, discriminator_fake_loss=0.9619, generator_loss=31.22, generator_mel_loss=19.14, generator_kl_loss=1.946, generator_dur_loss=1.47, generator_adv_loss=2.387, generator_feat_match_loss=6.281, over 79.00 samples.], tot_loss[discriminator_loss=2.367, discriminator_real_loss=1.157, discriminator_fake_loss=1.211, generator_loss=30.68, generator_mel_loss=18.97, generator_kl_loss=1.976, generator_dur_loss=1.462, generator_adv_loss=2.486, generator_feat_match_loss=5.784, over 884.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:09:05,027 INFO [train.py:845] (3/4) Start epoch 949
2024-02-24 17:11:34,825 INFO [train.py:471] (3/4) Epoch 949, batch 24, global_batch_idx: 35100, batch size: 90, loss[discriminator_loss=2.18, discriminator_real_loss=1.083, discriminator_fake_loss=1.096, generator_loss=31.2, generator_mel_loss=18.9, generator_kl_loss=1.947, generator_dur_loss=1.46, generator_adv_loss=2.623, generator_feat_match_loss=6.273, over 90.00 samples.], tot_loss[discriminator_loss=2.333, discriminator_real_loss=1.16, discriminator_fake_loss=1.173, generator_loss=30.5, generator_mel_loss=18.96, generator_kl_loss=1.964, generator_dur_loss=1.46, generator_adv_loss=2.505, generator_feat_match_loss=5.613, over 1948.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:12:38,430 INFO [train.py:845] (3/4) Start epoch 950
2024-02-24 17:16:10,897 INFO [train.py:845] (3/4) Start epoch 951
2024-02-24 17:16:24,163 INFO [train.py:471] (3/4) Epoch 951, batch 0, global_batch_idx: 35150, batch size: 52, loss[discriminator_loss=2.326, discriminator_real_loss=1.197, discriminator_fake_loss=1.129, generator_loss=30.55, generator_mel_loss=19.14, generator_kl_loss=1.964, generator_dur_loss=1.463, generator_adv_loss=2.465, generator_feat_match_loss=5.512, over 52.00 samples.], tot_loss[discriminator_loss=2.326, discriminator_real_loss=1.197, discriminator_fake_loss=1.129, generator_loss=30.55, generator_mel_loss=19.14, generator_kl_loss=1.964, generator_dur_loss=1.463, generator_adv_loss=2.465, generator_feat_match_loss=5.512, over 52.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:19:32,978 INFO [train.py:845] (3/4) Start epoch 952
2024-02-24 17:21:02,966 INFO [train.py:471] (3/4) Epoch 952, batch 13, global_batch_idx: 35200, batch size: 52, loss[discriminator_loss=2.336, discriminator_real_loss=1.164, discriminator_fake_loss=1.172, generator_loss=29.71, generator_mel_loss=18.51, generator_kl_loss=1.964, generator_dur_loss=1.465, generator_adv_loss=2.518, generator_feat_match_loss=5.25, over 52.00 samples.], tot_loss[discriminator_loss=2.332, discriminator_real_loss=1.191, discriminator_fake_loss=1.14, generator_loss=30.5, generator_mel_loss=18.86, generator_kl_loss=1.981, generator_dur_loss=1.467, generator_adv_loss=2.527, generator_feat_match_loss=5.656, over 975.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 16.0
2024-02-24 17:21:02,968 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 17:21:11,550 INFO [train.py:534] (3/4) Epoch 952, validation: discriminator_loss=2.25, discriminator_real_loss=1.106, discriminator_fake_loss=1.143, generator_loss=31.22, generator_mel_loss=19.46, generator_kl_loss=2.107, generator_dur_loss=1.469, generator_adv_loss=2.479, generator_feat_match_loss=5.705, over 100.00 samples.
2024-02-24 17:21:11,551 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 17:23:08,254 INFO [train.py:845] (3/4) Start epoch 953
2024-02-24 17:25:39,397 INFO [train.py:471] (3/4) Epoch 953, batch 26, global_batch_idx: 35250, batch size: 59, loss[discriminator_loss=2.18, discriminator_real_loss=0.9951, discriminator_fake_loss=1.184, generator_loss=30.93, generator_mel_loss=18.9, generator_kl_loss=1.89, generator_dur_loss=1.461, generator_adv_loss=2.643, generator_feat_match_loss=6.039, over 59.00 samples.], tot_loss[discriminator_loss=2.32, discriminator_real_loss=1.172, discriminator_fake_loss=1.148, generator_loss=30.86, generator_mel_loss=19.04, generator_kl_loss=1.992, generator_dur_loss=1.466, generator_adv_loss=2.552, generator_feat_match_loss=5.806, over 1960.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:26:33,537 INFO [train.py:845] (3/4) Start epoch 954
2024-02-24 17:30:04,924 INFO [train.py:845] (3/4) Start epoch 955
2024-02-24 17:30:27,645 INFO [train.py:471] (3/4) Epoch 955, batch 2, global_batch_idx: 35300, batch size: 49, loss[discriminator_loss=2.379, discriminator_real_loss=1.159, discriminator_fake_loss=1.219, generator_loss=31.16, generator_mel_loss=19.53, generator_kl_loss=1.915, generator_dur_loss=1.473, generator_adv_loss=2.562, generator_feat_match_loss=5.684, over 49.00 samples.], tot_loss[discriminator_loss=2.474, discriminator_real_loss=1.178, discriminator_fake_loss=1.296, generator_loss=30.58, generator_mel_loss=19.32, generator_kl_loss=1.963, generator_dur_loss=1.472, generator_adv_loss=2.359, generator_feat_match_loss=5.465, over 193.00 samples.], cur_lr_g: 1.78e-04, cur_lr_d: 1.78e-04, grad_scale: 8.0
2024-02-24 17:33:27,434 INFO [train.py:845] (3/4) Start epoch 956
2024-02-24 17:35:05,521 INFO [train.py:471] (3/4) Epoch 956, batch 15, global_batch_idx: 35350, batch size: 71, loss[discriminator_loss=2.682, discriminator_real_loss=1.633, discriminator_fake_loss=1.049, generator_loss=30.56, generator_mel_loss=18.94, generator_kl_loss=2.041, generator_dur_loss=1.47, generator_adv_loss=2.566, generator_feat_match_loss=5.547, over 71.00 samples.], tot_loss[discriminator_loss=2.337, discriminator_real_loss=1.156, discriminator_fake_loss=1.181, generator_loss=30.95, generator_mel_loss=19.01, generator_kl_loss=1.993, generator_dur_loss=1.46, generator_adv_loss=2.561, generator_feat_match_loss=5.924, over 1424.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 17:37:00,516 INFO [train.py:845] (3/4) Start epoch 957
2024-02-24 17:39:46,849 INFO [train.py:471] (3/4) Epoch 957, batch 28, global_batch_idx: 35400, batch size: 61, loss[discriminator_loss=2.383, discriminator_real_loss=1.229, discriminator_fake_loss=1.153, generator_loss=31.3, generator_mel_loss=20.01, generator_kl_loss=1.963, generator_dur_loss=1.447, generator_adv_loss=2.475, generator_feat_match_loss=5.402, over 61.00 samples.], tot_loss[discriminator_loss=2.358, discriminator_real_loss=1.189, discriminator_fake_loss=1.169, generator_loss=30.64, generator_mel_loss=19.19, generator_kl_loss=1.999, generator_dur_loss=1.464, generator_adv_loss=2.463, generator_feat_match_loss=5.523, over 2129.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 17:39:46,851 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 17:39:55,511 INFO [train.py:534] (3/4) Epoch 957, validation: discriminator_loss=2.405, discriminator_real_loss=1.142, discriminator_fake_loss=1.263, generator_loss=31.67, generator_mel_loss=20.33, generator_kl_loss=2.225, generator_dur_loss=1.474, generator_adv_loss=2.225, generator_feat_match_loss=5.418, over 100.00 samples.
2024-02-24 17:39:55,512 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 17:40:40,504 INFO [train.py:845] (3/4) Start epoch 958
2024-02-24 17:44:04,231 INFO [train.py:845] (3/4) Start epoch 959
2024-02-24 17:44:38,746 INFO [train.py:471] (3/4) Epoch 959, batch 4, global_batch_idx: 35450, batch size: 55, loss[discriminator_loss=2.363, discriminator_real_loss=1.422, discriminator_fake_loss=0.9414, generator_loss=31.75, generator_mel_loss=18.93, generator_kl_loss=2.03, generator_dur_loss=1.49, generator_adv_loss=2.881, generator_feat_match_loss=6.414, over 55.00 samples.], tot_loss[discriminator_loss=2.313, discriminator_real_loss=1.202, discriminator_fake_loss=1.111, generator_loss=31.58, generator_mel_loss=19.21, generator_kl_loss=1.986, generator_dur_loss=1.472, generator_adv_loss=2.673, generator_feat_match_loss=6.237, over 301.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 17:47:31,731 INFO [train.py:845] (3/4) Start epoch 960
2024-02-24 17:49:20,667 INFO [train.py:471] (3/4) Epoch 960, batch 17, global_batch_idx: 35500, batch size: 79, loss[discriminator_loss=2.432, discriminator_real_loss=1.247, discriminator_fake_loss=1.185, generator_loss=30.12, generator_mel_loss=19.19, generator_kl_loss=1.974, generator_dur_loss=1.454, generator_adv_loss=2.352, generator_feat_match_loss=5.148, over 79.00 samples.], tot_loss[discriminator_loss=2.435, discriminator_real_loss=1.223, discriminator_fake_loss=1.212, generator_loss=30.09, generator_mel_loss=19.07, generator_kl_loss=1.992, generator_dur_loss=1.459, generator_adv_loss=2.396, generator_feat_match_loss=5.167, over 1284.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 17:51:04,121 INFO [train.py:845] (3/4) Start epoch 961
2024-02-24 17:53:51,484 INFO [train.py:471] (3/4) Epoch 961, batch 30, global_batch_idx: 35550, batch size: 73, loss[discriminator_loss=2.336, discriminator_real_loss=1.207, discriminator_fake_loss=1.128, generator_loss=31.29, generator_mel_loss=19.5, generator_kl_loss=1.91, generator_dur_loss=1.465, generator_adv_loss=2.482, generator_feat_match_loss=5.93, over 73.00 samples.], tot_loss[discriminator_loss=2.344, discriminator_real_loss=1.175, discriminator_fake_loss=1.169, generator_loss=30.58, generator_mel_loss=19.07, generator_kl_loss=1.958, generator_dur_loss=1.466, generator_adv_loss=2.491, generator_feat_match_loss=5.595, over 2350.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 17:54:29,323 INFO [train.py:845] (3/4) Start epoch 962
2024-02-24 17:58:03,858 INFO [train.py:845] (3/4) Start epoch 963
2024-02-24 17:58:53,953 INFO [train.py:471] (3/4) Epoch 963, batch 6, global_batch_idx: 35600, batch size: 110, loss[discriminator_loss=2.365, discriminator_real_loss=1.207, discriminator_fake_loss=1.158, generator_loss=31.3, generator_mel_loss=19.74, generator_kl_loss=2.085, generator_dur_loss=1.486, generator_adv_loss=2.443, generator_feat_match_loss=5.543, over 110.00 samples.], tot_loss[discriminator_loss=2.429, discriminator_real_loss=1.231, discriminator_fake_loss=1.198, generator_loss=30.99, generator_mel_loss=19.73, generator_kl_loss=2.033, generator_dur_loss=1.471, generator_adv_loss=2.412, generator_feat_match_loss=5.343, over 545.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 17:58:53,955 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 17:59:02,580 INFO [train.py:534] (3/4) Epoch 963, validation: discriminator_loss=2.424, discriminator_real_loss=1.14, discriminator_fake_loss=1.285, generator_loss=31.29, generator_mel_loss=20.13, generator_kl_loss=2.256, generator_dur_loss=1.465, generator_adv_loss=2.192, generator_feat_match_loss=5.247, over 100.00 samples.
2024-02-24 17:59:02,580 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 18:01:42,454 INFO [train.py:845] (3/4) Start epoch 964
2024-02-24 18:03:37,126 INFO [train.py:471] (3/4) Epoch 964, batch 19, global_batch_idx: 35650, batch size: 69, loss[discriminator_loss=2.586, discriminator_real_loss=1.112, discriminator_fake_loss=1.475, generator_loss=29.61, generator_mel_loss=19.02, generator_kl_loss=2.001, generator_dur_loss=1.458, generator_adv_loss=2.229, generator_feat_match_loss=4.898, over 69.00 samples.], tot_loss[discriminator_loss=2.545, discriminator_real_loss=1.307, discriminator_fake_loss=1.238, generator_loss=30.64, generator_mel_loss=19.16, generator_kl_loss=1.958, generator_dur_loss=1.475, generator_adv_loss=2.573, generator_feat_match_loss=5.475, over 1253.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 4.0
2024-02-24 18:05:11,143 INFO [train.py:845] (3/4) Start epoch 965
2024-02-24 18:08:23,596 INFO [train.py:471] (3/4) Epoch 965, batch 32, global_batch_idx: 35700, batch size: 69, loss[discriminator_loss=2.861, discriminator_real_loss=1.277, discriminator_fake_loss=1.584, generator_loss=29.14, generator_mel_loss=19.15, generator_kl_loss=2.006, generator_dur_loss=1.478, generator_adv_loss=2.002, generator_feat_match_loss=4.512, over 69.00 samples.], tot_loss[discriminator_loss=2.495, discriminator_real_loss=1.253, discriminator_fake_loss=1.242, generator_loss=30.48, generator_mel_loss=19.41, generator_kl_loss=1.982, generator_dur_loss=1.463, generator_adv_loss=2.402, generator_feat_match_loss=5.222, over 2430.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:08:43,128 INFO [train.py:845] (3/4) Start epoch 966
2024-02-24 18:12:10,265 INFO [train.py:845] (3/4) Start epoch 967
2024-02-24 18:13:09,645 INFO [train.py:471] (3/4) Epoch 967, batch 8, global_batch_idx: 35750, batch size: 49, loss[discriminator_loss=2.369, discriminator_real_loss=1.251, discriminator_fake_loss=1.118, generator_loss=30.5, generator_mel_loss=18.98, generator_kl_loss=1.973, generator_dur_loss=1.484, generator_adv_loss=2.508, generator_feat_match_loss=5.559, over 49.00 samples.], tot_loss[discriminator_loss=2.346, discriminator_real_loss=1.168, discriminator_fake_loss=1.178, generator_loss=30.81, generator_mel_loss=19.3, generator_kl_loss=1.983, generator_dur_loss=1.47, generator_adv_loss=2.471, generator_feat_match_loss=5.586, over 667.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:15:44,280 INFO [train.py:845] (3/4) Start epoch 968
2024-02-24 18:17:47,936 INFO [train.py:471] (3/4) Epoch 968, batch 21, global_batch_idx: 35800, batch size: 81, loss[discriminator_loss=2.258, discriminator_real_loss=1.187, discriminator_fake_loss=1.07, generator_loss=31.08, generator_mel_loss=19.38, generator_kl_loss=1.99, generator_dur_loss=1.457, generator_adv_loss=2.584, generator_feat_match_loss=5.676, over 81.00 samples.], tot_loss[discriminator_loss=2.533, discriminator_real_loss=1.311, discriminator_fake_loss=1.221, generator_loss=30.79, generator_mel_loss=19.76, generator_kl_loss=2.037, generator_dur_loss=1.459, generator_adv_loss=2.389, generator_feat_match_loss=5.144, over 1751.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:17:47,938 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 18:17:56,226 INFO [train.py:534] (3/4) Epoch 968, validation: discriminator_loss=2.295, discriminator_real_loss=1.019, discriminator_fake_loss=1.275, generator_loss=32.14, generator_mel_loss=20.07, generator_kl_loss=2.278, generator_dur_loss=1.466, generator_adv_loss=2.424, generator_feat_match_loss=5.902, over 100.00 samples.
2024-02-24 18:17:56,227 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 18:19:13,567 INFO [train.py:845] (3/4) Start epoch 969
2024-02-24 18:22:31,548 INFO [train.py:471] (3/4) Epoch 969, batch 34, global_batch_idx: 35850, batch size: 90, loss[discriminator_loss=2.275, discriminator_real_loss=1.074, discriminator_fake_loss=1.201, generator_loss=30.56, generator_mel_loss=18.85, generator_kl_loss=1.903, generator_dur_loss=1.469, generator_adv_loss=2.588, generator_feat_match_loss=5.75, over 90.00 samples.], tot_loss[discriminator_loss=2.41, discriminator_real_loss=1.218, discriminator_fake_loss=1.192, generator_loss=30.59, generator_mel_loss=19.19, generator_kl_loss=1.959, generator_dur_loss=1.459, generator_adv_loss=2.462, generator_feat_match_loss=5.522, over 2876.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:22:43,620 INFO [train.py:845] (3/4) Start epoch 970
2024-02-24 18:26:09,411 INFO [train.py:845] (3/4) Start epoch 971
2024-02-24 18:27:18,423 INFO [train.py:471] (3/4) Epoch 971, batch 10, global_batch_idx: 35900, batch size: 67, loss[discriminator_loss=2.52, discriminator_real_loss=1.201, discriminator_fake_loss=1.317, generator_loss=30.44, generator_mel_loss=19.52, generator_kl_loss=1.995, generator_dur_loss=1.465, generator_adv_loss=2.393, generator_feat_match_loss=5.07, over 67.00 samples.], tot_loss[discriminator_loss=2.498, discriminator_real_loss=1.261, discriminator_fake_loss=1.237, generator_loss=30.65, generator_mel_loss=19.72, generator_kl_loss=2.029, generator_dur_loss=1.457, generator_adv_loss=2.326, generator_feat_match_loss=5.124, over 947.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:29:37,231 INFO [train.py:845] (3/4) Start epoch 972
2024-02-24 18:31:56,888 INFO [train.py:471] (3/4) Epoch 972, batch 23, global_batch_idx: 35950, batch size: 126, loss[discriminator_loss=2.301, discriminator_real_loss=1.176, discriminator_fake_loss=1.125, generator_loss=30.5, generator_mel_loss=18.96, generator_kl_loss=2.041, generator_dur_loss=1.444, generator_adv_loss=2.371, generator_feat_match_loss=5.684, over 126.00 samples.], tot_loss[discriminator_loss=2.43, discriminator_real_loss=1.226, discriminator_fake_loss=1.204, generator_loss=30.78, generator_mel_loss=19.26, generator_kl_loss=1.958, generator_dur_loss=1.461, generator_adv_loss=2.493, generator_feat_match_loss=5.613, over 1796.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:33:07,939 INFO [train.py:845] (3/4) Start epoch 973
2024-02-24 18:36:35,420 INFO [train.py:471] (3/4) Epoch 973, batch 36, global_batch_idx: 36000, batch size: 90, loss[discriminator_loss=2.277, discriminator_real_loss=1.158, discriminator_fake_loss=1.12, generator_loss=31.26, generator_mel_loss=19.31, generator_kl_loss=2.031, generator_dur_loss=1.477, generator_adv_loss=2.555, generator_feat_match_loss=5.887, over 90.00 samples.], tot_loss[discriminator_loss=2.428, discriminator_real_loss=1.224, discriminator_fake_loss=1.203, generator_loss=30.38, generator_mel_loss=19.01, generator_kl_loss=1.96, generator_dur_loss=1.466, generator_adv_loss=2.476, generator_feat_match_loss=5.469, over 2487.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 18:36:35,422 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 18:36:43,781 INFO [train.py:534] (3/4) Epoch 973, validation: discriminator_loss=2.281, discriminator_real_loss=1.118, discriminator_fake_loss=1.163, generator_loss=31.79, generator_mel_loss=19.75, generator_kl_loss=2.207, generator_dur_loss=1.475, generator_adv_loss=2.432, generator_feat_match_loss=5.929, over 100.00 samples.
2024-02-24 18:36:43,782 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 18:36:44,215 INFO [train.py:845] (3/4) Start epoch 974
2024-02-24 18:40:15,319 INFO [train.py:845] (3/4) Start epoch 975
2024-02-24 18:41:30,053 INFO [train.py:471] (3/4) Epoch 975, batch 12, global_batch_idx: 36050, batch size: 69, loss[discriminator_loss=2.504, discriminator_real_loss=1.309, discriminator_fake_loss=1.196, generator_loss=30.59, generator_mel_loss=19.64, generator_kl_loss=2.049, generator_dur_loss=1.472, generator_adv_loss=2.326, generator_feat_match_loss=5.098, over 69.00 samples.], tot_loss[discriminator_loss=2.459, discriminator_real_loss=1.25, discriminator_fake_loss=1.209, generator_loss=30.43, generator_mel_loss=19.56, generator_kl_loss=1.977, generator_dur_loss=1.47, generator_adv_loss=2.329, generator_feat_match_loss=5.086, over 1026.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:43:44,659 INFO [train.py:845] (3/4) Start epoch 976
2024-02-24 18:46:14,289 INFO [train.py:471] (3/4) Epoch 976, batch 25, global_batch_idx: 36100, batch size: 55, loss[discriminator_loss=2.506, discriminator_real_loss=1.309, discriminator_fake_loss=1.197, generator_loss=30.66, generator_mel_loss=19.39, generator_kl_loss=1.993, generator_dur_loss=1.503, generator_adv_loss=2.502, generator_feat_match_loss=5.27, over 55.00 samples.], tot_loss[discriminator_loss=2.491, discriminator_real_loss=1.241, discriminator_fake_loss=1.25, generator_loss=30.62, generator_mel_loss=19.44, generator_kl_loss=1.997, generator_dur_loss=1.461, generator_adv_loss=2.405, generator_feat_match_loss=5.32, over 2015.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:47:14,844 INFO [train.py:845] (3/4) Start epoch 977
2024-02-24 18:50:43,754 INFO [train.py:845] (3/4) Start epoch 978
2024-02-24 18:51:02,921 INFO [train.py:471] (3/4) Epoch 978, batch 1, global_batch_idx: 36150, batch size: 56, loss[discriminator_loss=2.443, discriminator_real_loss=1.242, discriminator_fake_loss=1.201, generator_loss=30.75, generator_mel_loss=19.56, generator_kl_loss=2.054, generator_dur_loss=1.47, generator_adv_loss=2.406, generator_feat_match_loss=5.262, over 56.00 samples.], tot_loss[discriminator_loss=2.458, discriminator_real_loss=1.243, discriminator_fake_loss=1.216, generator_loss=30.42, generator_mel_loss=19.42, generator_kl_loss=2.022, generator_dur_loss=1.481, generator_adv_loss=2.367, generator_feat_match_loss=5.129, over 114.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:54:09,687 INFO [train.py:845] (3/4) Start epoch 979
2024-02-24 18:55:34,895 INFO [train.py:471] (3/4) Epoch 979, batch 14, global_batch_idx: 36200, batch size: 53, loss[discriminator_loss=2.307, discriminator_real_loss=1.151, discriminator_fake_loss=1.155, generator_loss=30.63, generator_mel_loss=18.89, generator_kl_loss=1.884, generator_dur_loss=1.469, generator_adv_loss=2.527, generator_feat_match_loss=5.863, over 53.00 samples.], tot_loss[discriminator_loss=2.366, discriminator_real_loss=1.209, discriminator_fake_loss=1.157, generator_loss=30.96, generator_mel_loss=19.18, generator_kl_loss=1.966, generator_dur_loss=1.468, generator_adv_loss=2.595, generator_feat_match_loss=5.757, over 1086.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 18:55:34,897 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 18:55:43,392 INFO [train.py:534] (3/4) Epoch 979, validation: discriminator_loss=2.281, discriminator_real_loss=1.066, discriminator_fake_loss=1.215, generator_loss=31.61, generator_mel_loss=19.61, generator_kl_loss=2.126, generator_dur_loss=1.469, generator_adv_loss=2.501, generator_feat_match_loss=5.898, over 100.00 samples.
2024-02-24 18:55:43,393 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 18:57:43,287 INFO [train.py:845] (3/4) Start epoch 980
2024-02-24 19:00:26,679 INFO [train.py:471] (3/4) Epoch 980, batch 27, global_batch_idx: 36250, batch size: 63, loss[discriminator_loss=2.457, discriminator_real_loss=1.203, discriminator_fake_loss=1.255, generator_loss=30.19, generator_mel_loss=18.8, generator_kl_loss=1.914, generator_dur_loss=1.482, generator_adv_loss=2.682, generator_feat_match_loss=5.305, over 63.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.2, discriminator_fake_loss=1.175, generator_loss=30.56, generator_mel_loss=18.98, generator_kl_loss=1.956, generator_dur_loss=1.466, generator_adv_loss=2.531, generator_feat_match_loss=5.626, over 2036.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:01:13,310 INFO [train.py:845] (3/4) Start epoch 981
2024-02-24 19:04:44,204 INFO [train.py:845] (3/4) Start epoch 982
2024-02-24 19:05:13,004 INFO [train.py:471] (3/4) Epoch 982, batch 3, global_batch_idx: 36300, batch size: 64, loss[discriminator_loss=2.328, discriminator_real_loss=1.235, discriminator_fake_loss=1.092, generator_loss=30.85, generator_mel_loss=19.41, generator_kl_loss=1.988, generator_dur_loss=1.46, generator_adv_loss=2.467, generator_feat_match_loss=5.527, over 64.00 samples.], tot_loss[discriminator_loss=2.323, discriminator_real_loss=1.223, discriminator_fake_loss=1.099, generator_loss=30.93, generator_mel_loss=19.38, generator_kl_loss=1.992, generator_dur_loss=1.462, generator_adv_loss=2.504, generator_feat_match_loss=5.596, over 285.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:08:13,959 INFO [train.py:845] (3/4) Start epoch 983
2024-02-24 19:09:55,363 INFO [train.py:471] (3/4) Epoch 983, batch 16, global_batch_idx: 36350, batch size: 53, loss[discriminator_loss=2.512, discriminator_real_loss=1.35, discriminator_fake_loss=1.162, generator_loss=30.31, generator_mel_loss=19.13, generator_kl_loss=1.919, generator_dur_loss=1.468, generator_adv_loss=2.547, generator_feat_match_loss=5.242, over 53.00 samples.], tot_loss[discriminator_loss=2.4, discriminator_real_loss=1.219, discriminator_fake_loss=1.181, generator_loss=30.73, generator_mel_loss=19.03, generator_kl_loss=1.966, generator_dur_loss=1.465, generator_adv_loss=2.552, generator_feat_match_loss=5.721, over 1275.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:11:44,595 INFO [train.py:845] (3/4) Start epoch 984
2024-02-24 19:14:34,343 INFO [train.py:471] (3/4) Epoch 984, batch 29, global_batch_idx: 36400, batch size: 53, loss[discriminator_loss=2.391, discriminator_real_loss=1.247, discriminator_fake_loss=1.145, generator_loss=31.02, generator_mel_loss=19.11, generator_kl_loss=2.078, generator_dur_loss=1.507, generator_adv_loss=2.506, generator_feat_match_loss=5.816, over 53.00 samples.], tot_loss[discriminator_loss=2.344, discriminator_real_loss=1.182, discriminator_fake_loss=1.162, generator_loss=30.78, generator_mel_loss=19.07, generator_kl_loss=1.983, generator_dur_loss=1.463, generator_adv_loss=2.532, generator_feat_match_loss=5.739, over 2305.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 19:14:34,345 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 19:14:43,201 INFO [train.py:534] (3/4) Epoch 984, validation: discriminator_loss=2.285, discriminator_real_loss=1.101, discriminator_fake_loss=1.184, generator_loss=31.51, generator_mel_loss=19.53, generator_kl_loss=2.212, generator_dur_loss=1.46, generator_adv_loss=2.452, generator_feat_match_loss=5.857, over 100.00 samples.
2024-02-24 19:14:43,202 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 19:15:22,931 INFO [train.py:845] (3/4) Start epoch 985
2024-02-24 19:18:54,882 INFO [train.py:845] (3/4) Start epoch 986
2024-02-24 19:19:37,276 INFO [train.py:471] (3/4) Epoch 986, batch 5, global_batch_idx: 36450, batch size: 64, loss[discriminator_loss=2.658, discriminator_real_loss=1.268, discriminator_fake_loss=1.391, generator_loss=29.4, generator_mel_loss=18.75, generator_kl_loss=1.996, generator_dur_loss=1.436, generator_adv_loss=2.363, generator_feat_match_loss=4.859, over 64.00 samples.], tot_loss[discriminator_loss=2.376, discriminator_real_loss=1.164, discriminator_fake_loss=1.212, generator_loss=30.87, generator_mel_loss=19.1, generator_kl_loss=1.976, generator_dur_loss=1.456, generator_adv_loss=2.553, generator_feat_match_loss=5.783, over 485.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:22:19,391 INFO [train.py:845] (3/4) Start epoch 987
2024-02-24 19:24:16,754 INFO [train.py:471] (3/4) Epoch 987, batch 18, global_batch_idx: 36500, batch size: 153, loss[discriminator_loss=2.414, discriminator_real_loss=1.209, discriminator_fake_loss=1.206, generator_loss=30.15, generator_mel_loss=18.74, generator_kl_loss=1.992, generator_dur_loss=1.427, generator_adv_loss=2.496, generator_feat_match_loss=5.5, over 153.00 samples.], tot_loss[discriminator_loss=2.402, discriminator_real_loss=1.203, discriminator_fake_loss=1.199, generator_loss=30.19, generator_mel_loss=18.83, generator_kl_loss=1.958, generator_dur_loss=1.455, generator_adv_loss=2.447, generator_feat_match_loss=5.507, over 1418.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:25:47,604 INFO [train.py:845] (3/4) Start epoch 988
2024-02-24 19:28:54,695 INFO [train.py:471] (3/4) Epoch 988, batch 31, global_batch_idx: 36550, batch size: 85, loss[discriminator_loss=2.523, discriminator_real_loss=1.28, discriminator_fake_loss=1.244, generator_loss=30.72, generator_mel_loss=19.75, generator_kl_loss=1.972, generator_dur_loss=1.465, generator_adv_loss=2.408, generator_feat_match_loss=5.125, over 85.00 samples.], tot_loss[discriminator_loss=2.468, discriminator_real_loss=1.242, discriminator_fake_loss=1.226, generator_loss=30.6, generator_mel_loss=19.59, generator_kl_loss=2.001, generator_dur_loss=1.46, generator_adv_loss=2.351, generator_feat_match_loss=5.201, over 2485.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:29:18,915 INFO [train.py:845] (3/4) Start epoch 989
2024-02-24 19:32:50,211 INFO [train.py:845] (3/4) Start epoch 990
2024-02-24 19:33:40,825 INFO [train.py:471] (3/4) Epoch 990, batch 7, global_batch_idx: 36600, batch size: 54, loss[discriminator_loss=2.193, discriminator_real_loss=1.017, discriminator_fake_loss=1.177, generator_loss=31.32, generator_mel_loss=19.13, generator_kl_loss=2.061, generator_dur_loss=1.482, generator_adv_loss=2.594, generator_feat_match_loss=6.055, over 54.00 samples.], tot_loss[discriminator_loss=2.439, discriminator_real_loss=1.19, discriminator_fake_loss=1.249, generator_loss=30.41, generator_mel_loss=18.97, generator_kl_loss=1.963, generator_dur_loss=1.459, generator_adv_loss=2.493, generator_feat_match_loss=5.531, over 623.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:33:40,827 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 19:33:49,518 INFO [train.py:534] (3/4) Epoch 990, validation: discriminator_loss=2.207, discriminator_real_loss=1.045, discriminator_fake_loss=1.162, generator_loss=31.92, generator_mel_loss=19.62, generator_kl_loss=2.194, generator_dur_loss=1.468, generator_adv_loss=2.473, generator_feat_match_loss=6.168, over 100.00 samples.
2024-02-24 19:33:49,519 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 19:36:30,939 INFO [train.py:845] (3/4) Start epoch 991
2024-02-24 19:38:36,942 INFO [train.py:471] (3/4) Epoch 991, batch 20, global_batch_idx: 36650, batch size: 154, loss[discriminator_loss=2.328, discriminator_real_loss=1.199, discriminator_fake_loss=1.13, generator_loss=31.2, generator_mel_loss=19.28, generator_kl_loss=1.94, generator_dur_loss=1.445, generator_adv_loss=2.549, generator_feat_match_loss=5.988, over 154.00 samples.], tot_loss[discriminator_loss=2.358, discriminator_real_loss=1.19, discriminator_fake_loss=1.168, generator_loss=30.55, generator_mel_loss=19.04, generator_kl_loss=1.949, generator_dur_loss=1.463, generator_adv_loss=2.483, generator_feat_match_loss=5.613, over 1573.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:40:03,197 INFO [train.py:845] (3/4) Start epoch 992
2024-02-24 19:43:06,844 INFO [train.py:471] (3/4) Epoch 992, batch 33, global_batch_idx: 36700, batch size: 63, loss[discriminator_loss=2.344, discriminator_real_loss=1.175, discriminator_fake_loss=1.169, generator_loss=30.44, generator_mel_loss=19.04, generator_kl_loss=1.894, generator_dur_loss=1.451, generator_adv_loss=2.355, generator_feat_match_loss=5.699, over 63.00 samples.], tot_loss[discriminator_loss=2.32, discriminator_real_loss=1.17, discriminator_fake_loss=1.15, generator_loss=30.76, generator_mel_loss=18.94, generator_kl_loss=1.977, generator_dur_loss=1.466, generator_adv_loss=2.557, generator_feat_match_loss=5.828, over 2501.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:43:27,637 INFO [train.py:845] (3/4) Start epoch 993
2024-02-24 19:46:57,246 INFO [train.py:845] (3/4) Start epoch 994
2024-02-24 19:48:00,389 INFO [train.py:471] (3/4) Epoch 994, batch 9, global_batch_idx: 36750, batch size: 110, loss[discriminator_loss=2.262, discriminator_real_loss=1.099, discriminator_fake_loss=1.162, generator_loss=31.38, generator_mel_loss=19.05, generator_kl_loss=1.987, generator_dur_loss=1.443, generator_adv_loss=2.734, generator_feat_match_loss=6.164, over 110.00 samples.], tot_loss[discriminator_loss=2.313, discriminator_real_loss=1.177, discriminator_fake_loss=1.136, generator_loss=30.84, generator_mel_loss=18.96, generator_kl_loss=1.972, generator_dur_loss=1.463, generator_adv_loss=2.564, generator_feat_match_loss=5.875, over 879.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 8.0
2024-02-24 19:50:22,736 INFO [train.py:845] (3/4) Start epoch 995
2024-02-24 19:52:40,585 INFO [train.py:471] (3/4) Epoch 995, batch 22, global_batch_idx: 36800, batch size: 71, loss[discriminator_loss=2.297, discriminator_real_loss=1.244, discriminator_fake_loss=1.054, generator_loss=30.51, generator_mel_loss=18.81, generator_kl_loss=2.035, generator_dur_loss=1.442, generator_adv_loss=2.656, generator_feat_match_loss=5.57, over 71.00 samples.], tot_loss[discriminator_loss=2.322, discriminator_real_loss=1.176, discriminator_fake_loss=1.146, generator_loss=30.7, generator_mel_loss=18.93, generator_kl_loss=2.003, generator_dur_loss=1.456, generator_adv_loss=2.532, generator_feat_match_loss=5.775, over 1849.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 19:52:40,587 INFO [train.py:525] (3/4) Computing validation loss
2024-02-24 19:52:48,226 INFO [train.py:534] (3/4) Epoch 995, validation: discriminator_loss=2.353, discriminator_real_loss=1.165, discriminator_fake_loss=1.188, generator_loss=31.28, generator_mel_loss=19.39, generator_kl_loss=2.096, generator_dur_loss=1.463, generator_adv_loss=2.517, generator_feat_match_loss=5.814, over 100.00 samples.
2024-02-24 19:52:48,227 INFO [train.py:535] (3/4) Maximum memory allocated so far is 28218MB
2024-02-24 19:53:58,496 INFO [train.py:845] (3/4) Start epoch 996
2024-02-24 19:57:25,935 INFO [train.py:471] (3/4) Epoch 996, batch 35, global_batch_idx: 36850, batch size: 73, loss[discriminator_loss=2.328, discriminator_real_loss=1.118, discriminator_fake_loss=1.209, generator_loss=30.7, generator_mel_loss=18.93, generator_kl_loss=1.951, generator_dur_loss=1.472, generator_adv_loss=2.566, generator_feat_match_loss=5.785, over 73.00 samples.], tot_loss[discriminator_loss=2.354, discriminator_real_loss=1.188, discriminator_fake_loss=1.166, generator_loss=30.75, generator_mel_loss=18.95, generator_kl_loss=1.966, generator_dur_loss=1.46, generator_adv_loss=2.539, generator_feat_match_loss=5.835, over 2726.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 19:57:31,450 INFO [train.py:845] (3/4) Start epoch 997
2024-02-24 20:01:02,202 INFO [train.py:845] (3/4) Start epoch 998
2024-02-24 20:02:20,952 INFO [train.py:471] (3/4) Epoch 998, batch 11, global_batch_idx: 36900, batch size: 65, loss[discriminator_loss=2.227, discriminator_real_loss=1.077, discriminator_fake_loss=1.148, generator_loss=31.14, generator_mel_loss=18.93, generator_kl_loss=2.005, generator_dur_loss=1.474, generator_adv_loss=2.488, generator_feat_match_loss=6.238, over 65.00 samples.], tot_loss[discriminator_loss=2.373, discriminator_real_loss=1.193, discriminator_fake_loss=1.18, generator_loss=30.66, generator_mel_loss=18.93, generator_kl_loss=1.95, generator_dur_loss=1.461, generator_adv_loss=2.523, generator_feat_match_loss=5.787, over 933.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 20:04:35,580 INFO [train.py:845] (3/4) Start epoch 999
2024-02-24 20:06:56,218 INFO [train.py:471] (3/4) Epoch 999, batch 24, global_batch_idx: 36950, batch size: 69, loss[discriminator_loss=2.281, discriminator_real_loss=1.197, discriminator_fake_loss=1.084, generator_loss=30.81, generator_mel_loss=18.89, generator_kl_loss=2.062, generator_dur_loss=1.448, generator_adv_loss=2.525, generator_feat_match_loss=5.891, over 69.00 samples.], tot_loss[discriminator_loss=2.375, discriminator_real_loss=1.202, discriminator_fake_loss=1.173, generator_loss=30.73, generator_mel_loss=18.99, generator_kl_loss=1.972, generator_dur_loss=1.466, generator_adv_loss=2.543, generator_feat_match_loss=5.754, over 1752.00 samples.], cur_lr_g: 1.77e-04, cur_lr_d: 1.77e-04, grad_scale: 16.0
2024-02-24 20:08:03,068 INFO [train.py:845] (3/4) Start epoch 1000
2024-02-24 20:11:32,948 INFO [train.py:902] (3/4) Done!