----------------- Options ---------------
batch_size: 1
beta1: 0.5
checkpoints_dir: main [default: ./checkpoints]
continue_train: False
crop_size: 256
dataroot: ./datasets/new_seq_data/ [default: None]
dataset_mode: aligned
direction: AtoB
display_env: main
display_freq: 400
display_id: 1
display_ncols: 4
display_port: 8097
display_server: http://localhost
display_winsize: 256
epoch: latest
epoch_count: 1
gan_mode: vanilla
gpu_ids: 0
init_gain: 0.02
init_type: normal
input_nc: 3
isTrain: True [default: None]
lambda_L1: 100.0
load_iter: 0 [default: 0]
load_size: 286
lr: 0.002 [default: 0.0002]
lr_decay_iters: 50
lr_policy: linear
max_dataset_size: inf
model: pix2pix [default: cycle_gan]
n_epochs: 20 [default: 100]
n_epochs_decay: 1 [default: 100]
n_layers_D: 3
name: nuclei [default: experiment_name]
ndf: 64
netD: basic
netG: unet_256
ngf: 64
no_dropout: False
no_flip: False
no_html: False
norm: batch
num_threads: 4
output_nc: 3
phase: train
pool_size: 0
preprocess: resize_and_crop
print_freq: 100
save_by_iter: False
save_epoch_freq: 1 [default: 5]
save_latest_freq: 5000
serial_batches: False
suffix:
update_html_freq: 1000
use_wandb: False
verbose: False
wandb_project_name: CycleGAN-and-pix2pix
----------------- End -------------------
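The values annotated with [default: ...] above are the ones the options printer flags as differing from the stock defaults (chiefly model: pix2pix, name: nuclei, the dataroot, lr: 0.002, n_epochs: 20 and n_epochs_decay: 1, and a checkpoint every epoch). The two options that define the generator's objective are gan_mode: vanilla and lambda_L1: 100.0; as a rough sketch in plain PyTorch (pred_fake, fake_B and real_B are illustrative placeholder tensors, not names taken from this log), the loss they imply is:

    import torch
    import torch.nn as nn

    adv_criterion = nn.BCEWithLogitsLoss()  # gan_mode: vanilla -> BCE-with-logits adversarial loss
    l1_criterion = nn.L1Loss()
    lambda_L1 = 100.0                       # lambda_L1 from the options above

    def generator_loss(pred_fake, fake_B, real_B):
        # Adversarial term: push the discriminator's score on the fake toward "real" (label 1).
        loss_g_gan = adv_criterion(pred_fake, torch.ones_like(pred_fake))
        # Reconstruction term: pixel-wise L1 between generated and target image, weighted by 100.
        loss_g_l1 = l1_criterion(fake_B, real_B) * lambda_L1
        return loss_g_gan + loss_g_l1

The G_GAN and G_L1 columns printed every 100 iterations below correspond to these two terms.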
dataset [AlignedDataset] was created
/usr/local/lib/python3.10/dist-packages/torch/utils/data/dataloader.py:558: UserWarning: This DataLoader will create 4 worker processes in total. Our suggested max number of worker in current system is 2, which is smaller than what this DataLoader is going to create. Please be aware that excessive worker creation might get DataLoader running slow or even freeze, lower the worker number to avoid potential slowness/freeze if necessary.
warnings.warn(_create_warning_msg(
The number of training images = 427
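With 427 training images and batch_size: 1, each epoch is 427 iterations; that is why the cumulative totals at the epoch-end checkpoints grow by 427 (427, 854, 1281, ...) and why the every-100-iteration log lines drift within each epoch (100, 200, 300, 400, then 73, 173, ...), since print_freq: 100 counts total iterations. The DataLoader warning above stems from num_threads: 4 exceeding the 2 CPUs of this runtime; it is harmless, but it can be silenced by capping the worker count at the CPU count, as in this minimal sketch (the TensorDataset is only a stand-in for the repo's AlignedDataset):

    import os
    import torch
    from torch.utils.data import DataLoader, TensorDataset

    dataset = TensorDataset(torch.zeros(8, 3, 256, 256))  # placeholder for AlignedDataset
    # Use at most as many workers as the machine has CPUs (2 on this runtime).
    num_workers = min(4, os.cpu_count() or 1)
    loader = DataLoader(dataset, batch_size=1, shuffle=True, num_workers=num_workers)

With the repo's own train.py the same effect comes from passing a smaller --num_threads on the command line.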
initialize network with normal
initialize network with normal
model [Pix2PixModel] was created
---------- Networks initialized -------------
[Network G] Total number of parameters : 54.414 M
[Network D] Total number of parameters : 2.769 M
-----------------------------------------------
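The parameter totals printed above are plain counts over each network's tensors; for any torch module they can be reproduced with a one-liner like the sketch below (the Conv2d is just a toy module so the snippet runs on its own; in this run the modules would be the unet_256 generator at 54.414 M and the basic 3-layer PatchGAN discriminator at 2.769 M):

    import torch.nn as nn

    def count_params_m(net: nn.Module) -> float:
        # Total number of parameters, in millions.
        return sum(p.numel() for p in net.parameters()) / 1e6

    print(f"Total number of parameters : {count_params_m(nn.Conv2d(3, 64, kernel_size=4)):.3f} M")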
Setting up a new session...
create web directory main/nuclei/web...
/usr/local/lib/python3.10/dist-packages/torch/optim/lr_scheduler.py:143: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
warnings.warn("Detected call of `lr_scheduler.step()` before `optimizer.step()`. "
learning rate 0.0020000 -> 0.0020000
/usr/lib/python3.10/multiprocessing/popen_fork.py:66: RuntimeWarning: os.fork() was called. os.fork() is incompatible with multithreaded code, and JAX is multithreaded, so this will likely lead to a deadlock.
self.pid = os.fork()
(epoch: 1, iters: 100, time: 0.087, data: 0.143) G_GAN: 0.586 G_L1: 6.048 D_real: 1.075 D_fake: 0.677
(epoch: 1, iters: 200, time: 0.088, data: 0.002) G_GAN: 1.217 G_L1: 3.644 D_real: 1.859 D_fake: 0.247
(epoch: 1, iters: 300, time: 0.076, data: 0.002) G_GAN: 0.964 G_L1: 10.871 D_real: 0.556 D_fake: 0.527
(epoch: 1, iters: 400, time: 0.264, data: 0.009) G_GAN: 0.988 G_L1: 8.205 D_real: 0.432 D_fake: 0.529
saving the model at the end of epoch 1, iters 427
End of epoch 1 / 21 Time Taken: 33 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 2, iters: 73, time: 0.088, data: 0.002) G_GAN: 1.048 G_L1: 4.106 D_real: 1.020 D_fake: 0.405
(epoch: 2, iters: 173, time: 0.089, data: 0.002) G_GAN: 0.919 G_L1: 4.347 D_real: 0.280 D_fake: 0.596
(epoch: 2, iters: 273, time: 0.089, data: 0.002) G_GAN: 1.125 G_L1: 15.107 D_real: 0.027 D_fake: 0.487
(epoch: 2, iters: 373, time: 0.260, data: 0.002) G_GAN: 0.691 G_L1: 5.461 D_real: 0.378 D_fake: 0.756
saving the model at the end of epoch 2, iters 854
End of epoch 2 / 21 Time Taken: 25 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 3, iters: 46, time: 0.091, data: 0.002) G_GAN: 1.146 G_L1: 9.877 D_real: 0.018 D_fake: 0.935
(epoch: 3, iters: 146, time: 0.086, data: 0.002) G_GAN: 1.050 G_L1: 6.668 D_real: 0.314 D_fake: 1.927
(epoch: 3, iters: 246, time: 0.092, data: 0.002) G_GAN: 1.341 G_L1: 6.983 D_real: 1.525 D_fake: 0.290
(epoch: 3, iters: 346, time: 0.252, data: 0.002) G_GAN: 1.493 G_L1: 15.646 D_real: 0.015 D_fake: 0.344
saving the model at the end of epoch 3, iters 1281
End of epoch 3 / 21 Time Taken: 30 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 4, iters: 19, time: 0.091, data: 0.002) G_GAN: 1.207 G_L1: 9.371 D_real: 0.062 D_fake: 0.463
(epoch: 4, iters: 119, time: 0.082, data: 0.002) G_GAN: 0.773 G_L1: 2.439 D_real: 0.641 D_fake: 0.514
(epoch: 4, iters: 219, time: 0.094, data: 0.002) G_GAN: 1.282 G_L1: 4.250 D_real: 0.276 D_fake: 0.403
(epoch: 4, iters: 319, time: 0.291, data: 0.002) G_GAN: 1.035 G_L1: 5.616 D_real: 1.344 D_fake: 0.468
(epoch: 4, iters: 419, time: 0.094, data: 0.002) G_GAN: 1.019 G_L1: 9.700 D_real: 0.065 D_fake: 0.586
saving the model at the end of epoch 4, iters 1708
End of epoch 4 / 21 Time Taken: 25 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 5, iters: 92, time: 0.084, data: 0.002) G_GAN: 0.994 G_L1: 3.801 D_real: 1.195 D_fake: 0.414
(epoch: 5, iters: 192, time: 0.097, data: 0.009) G_GAN: 1.117 G_L1: 2.773 D_real: 1.197 D_fake: 0.373
(epoch: 5, iters: 292, time: 0.309, data: 0.002) G_GAN: 0.665 G_L1: 6.393 D_real: 0.675 D_fake: 0.799
(epoch: 5, iters: 392, time: 0.097, data: 0.009) G_GAN: 0.937 G_L1: 4.719 D_real: 0.594 D_fake: 0.457
saving the model at the end of epoch 5, iters 2135
End of epoch 5 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 6, iters: 65, time: 0.087, data: 0.002) G_GAN: 0.696 G_L1: 2.763 D_real: 0.747 D_fake: 0.720
(epoch: 6, iters: 165, time: 0.096, data: 0.003) G_GAN: 0.844 G_L1: 2.969 D_real: 1.070 D_fake: 0.446
(epoch: 6, iters: 265, time: 0.325, data: 0.002) G_GAN: 0.983 G_L1: 7.875 D_real: 0.426 D_fake: 0.544
(epoch: 6, iters: 365, time: 0.093, data: 0.013) G_GAN: 0.785 G_L1: 2.602 D_real: 0.794 D_fake: 0.555
saving the model at the end of epoch 6, iters 2562
End of epoch 6 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 7, iters: 38, time: 0.089, data: 0.003) G_GAN: 1.126 G_L1: 4.470 D_real: 0.341 D_fake: 0.397
(epoch: 7, iters: 138, time: 0.095, data: 0.009) G_GAN: 0.838 G_L1: 3.879 D_real: 0.982 D_fake: 0.578
(epoch: 7, iters: 238, time: 0.319, data: 0.002) G_GAN: 0.689 G_L1: 2.480 D_real: 0.746 D_fake: 0.725
(epoch: 7, iters: 338, time: 0.096, data: 0.002) G_GAN: 0.822 G_L1: 6.051 D_real: 1.074 D_fake: 0.515
saving the model at the end of epoch 7, iters 2989
End of epoch 7 / 21 Time Taken: 29 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 8, iters: 11, time: 0.094, data: 0.002) G_GAN: 1.011 G_L1: 3.824 D_real: 1.336 D_fake: 0.394
(epoch: 8, iters: 111, time: 0.085, data: 0.002) G_GAN: 0.746 G_L1: 2.095 D_real: 0.815 D_fake: 0.595
(epoch: 8, iters: 211, time: 0.268, data: 0.002) G_GAN: 0.974 G_L1: 5.989 D_real: 0.816 D_fake: 0.460
(epoch: 8, iters: 311, time: 0.090, data: 0.002) G_GAN: 0.948 G_L1: 2.710 D_real: 0.977 D_fake: 0.456
(epoch: 8, iters: 411, time: 0.097, data: 0.002) G_GAN: 1.044 G_L1: 8.212 D_real: 0.044 D_fake: 0.665
saving the model at the end of epoch 8, iters 3416
End of epoch 8 / 21 Time Taken: 29 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 9, iters: 84, time: 0.088, data: 0.002) G_GAN: 0.557 G_L1: 2.788 D_real: 0.628 D_fake: 0.998
(epoch: 9, iters: 184, time: 0.263, data: 0.002) G_GAN: 1.392 G_L1: 7.497 D_real: 0.099 D_fake: 0.713
(epoch: 9, iters: 284, time: 0.085, data: 0.002) G_GAN: 0.761 G_L1: 2.846 D_real: 0.830 D_fake: 0.563
(epoch: 9, iters: 384, time: 0.096, data: 0.002) G_GAN: 0.758 G_L1: 3.210 D_real: 0.932 D_fake: 0.480
saving the model at the end of epoch 9, iters 3843
End of epoch 9 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 10, iters: 57, time: 0.086, data: 0.002) G_GAN: 0.839 G_L1: 6.461 D_real: 0.836 D_fake: 0.360
(epoch: 10, iters: 157, time: 0.259, data: 0.008) G_GAN: 0.757 G_L1: 2.805 D_real: 1.042 D_fake: 0.395
(epoch: 10, iters: 257, time: 0.085, data: 0.002) G_GAN: 0.979 G_L1: 2.656 D_real: 1.123 D_fake: 0.373
(epoch: 10, iters: 357, time: 0.095, data: 0.010) G_GAN: 0.694 G_L1: 4.284 D_real: 0.616 D_fake: 0.649
saving the model at the end of epoch 10, iters 4270
End of epoch 10 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 11, iters: 30, time: 0.091, data: 0.002) G_GAN: 0.870 G_L1: 2.525 D_real: 1.023 D_fake: 0.489
(epoch: 11, iters: 130, time: 0.287, data: 0.002) G_GAN: 1.909 G_L1: 8.800 D_real: 0.337 D_fake: 0.290
(epoch: 11, iters: 230, time: 0.064, data: 0.002) G_GAN: 1.143 G_L1: 6.079 D_real: 0.193 D_fake: 0.833
(epoch: 11, iters: 330, time: 0.096, data: 0.004) G_GAN: 0.710 G_L1: 3.080 D_real: 0.538 D_fake: 0.844
saving the model at the end of epoch 11, iters 4697
End of epoch 11 / 21 Time Taken: 27 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 12, iters: 3, time: 0.134, data: 0.002) G_GAN: 1.131 G_L1: 5.544 D_real: 0.818 D_fake: 0.314
(epoch: 12, iters: 103, time: 0.277, data: 0.001) G_GAN: 1.229 G_L1: 9.888 D_real: 0.029 D_fake: 0.606
(epoch: 12, iters: 203, time: 0.096, data: 0.002) G_GAN: 1.264 G_L1: 6.673 D_real: 0.229 D_fake: 0.409
(epoch: 12, iters: 303, time: 0.095, data: 0.002) G_GAN: 0.899 G_L1: 4.113 D_real: 1.100 D_fake: 0.407
saving the latest model (epoch 12, total_iters 5000)
(epoch: 12, iters: 403, time: 0.093, data: 0.002) G_GAN: 1.282 G_L1: 4.151 D_real: 1.146 D_fake: 0.310
saving the model at the end of epoch 12, iters 5124
End of epoch 12 / 21 Time Taken: 27 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 13, iters: 76, time: 0.281, data: 0.002) G_GAN: 0.706 G_L1: 3.928 D_real: 0.790 D_fake: 0.592
(epoch: 13, iters: 176, time: 0.092, data: 0.002) G_GAN: 0.800 G_L1: 2.818 D_real: 0.867 D_fake: 0.532
(epoch: 13, iters: 276, time: 0.077, data: 0.002) G_GAN: 0.897 G_L1: 4.423 D_real: 0.567 D_fake: 0.503
(epoch: 13, iters: 376, time: 0.098, data: 0.002) G_GAN: 1.138 G_L1: 2.300 D_real: 1.324 D_fake: 0.321
saving the model at the end of epoch 13, iters 5551
End of epoch 13 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 14, iters: 49, time: 0.278, data: 0.002) G_GAN: 0.989 G_L1: 5.559 D_real: 0.322 D_fake: 0.499
(epoch: 14, iters: 149, time: 0.095, data: 0.002) G_GAN: 0.609 G_L1: 2.728 D_real: 0.581 D_fake: 0.750
(epoch: 14, iters: 249, time: 0.096, data: 0.002) G_GAN: 0.769 G_L1: 3.295 D_real: 0.604 D_fake: 0.666
(epoch: 14, iters: 349, time: 0.095, data: 0.002) G_GAN: 0.733 G_L1: 1.984 D_real: 0.612 D_fake: 0.657
saving the model at the end of epoch 14, iters 5978
End of epoch 14 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 15, iters: 22, time: 0.293, data: 0.002) G_GAN: 1.267 G_L1: 9.006 D_real: 0.164 D_fake: 0.821
(epoch: 15, iters: 122, time: 0.094, data: 0.002) G_GAN: 0.765 G_L1: 3.335 D_real: 0.525 D_fake: 0.713
(epoch: 15, iters: 222, time: 0.095, data: 0.002) G_GAN: 1.041 G_L1: 3.959 D_real: 0.275 D_fake: 0.745
(epoch: 15, iters: 322, time: 0.094, data: 0.002) G_GAN: 0.775 G_L1: 1.818 D_real: 0.439 D_fake: 0.830
(epoch: 15, iters: 422, time: 0.191, data: 0.002) G_GAN: 0.763 G_L1: 2.661 D_real: 0.854 D_fake: 0.431
saving the model at the end of epoch 15, iters 6405
End of epoch 15 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 16, iters: 95, time: 0.095, data: 0.002) G_GAN: 0.902 G_L1: 3.343 D_real: 0.700 D_fake: 0.473
(epoch: 16, iters: 195, time: 0.094, data: 0.002) G_GAN: 1.415 G_L1: 7.038 D_real: 0.964 D_fake: 0.259
(epoch: 16, iters: 295, time: 0.096, data: 0.002) G_GAN: 0.953 G_L1: 2.611 D_real: 0.706 D_fake: 0.494
(epoch: 16, iters: 395, time: 0.289, data: 0.002) G_GAN: 1.028 G_L1: 6.327 D_real: 0.241 D_fake: 0.836
saving the model at the end of epoch 16, iters 6832
End of epoch 16 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 17, iters: 68, time: 0.095, data: 0.002) G_GAN: 0.829 G_L1: 3.864 D_real: 0.436 D_fake: 0.853
(epoch: 17, iters: 168, time: 0.094, data: 0.002) G_GAN: 1.207 G_L1: 8.182 D_real: 0.393 D_fake: 0.590
(epoch: 17, iters: 268, time: 0.095, data: 0.002) G_GAN: 0.980 G_L1: 2.963 D_real: 0.658 D_fake: 0.628
(epoch: 17, iters: 368, time: 0.278, data: 0.002) G_GAN: 0.822 G_L1: 2.677 D_real: 0.688 D_fake: 0.600
saving the model at the end of epoch 17, iters 7259
End of epoch 17 / 21 Time Taken: 26 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 18, iters: 41, time: 0.094, data: 0.002) G_GAN: 1.819 G_L1: 9.183 D_real: 0.146 D_fake: 0.309
(epoch: 18, iters: 141, time: 0.097, data: 0.002) G_GAN: 1.119 G_L1: 4.914 D_real: 0.192 D_fake: 0.625
(epoch: 18, iters: 241, time: 0.096, data: 0.002) G_GAN: 0.628 G_L1: 4.281 D_real: 0.358 D_fake: 0.871
(epoch: 18, iters: 341, time: 0.446, data: 0.002) G_GAN: 0.917 G_L1: 2.483 D_real: 0.896 D_fake: 0.492
saving the model at the end of epoch 18, iters 7686
End of epoch 18 / 21 Time Taken: 27 sec
learning rate 0.0020000 -> 0.0020000
(epoch: 19, iters: 14, time: 0.093, data: 0.002) G_GAN: 0.789 G_L1: 2.937 D_real: 0.839 D_fake: 0.601
(epoch: 19, iters: 114, time: 0.096, data: 0.002) G_GAN: 0.639 G_L1: 4.513 D_real: 0.291 D_fake: 0.892
(epoch: 19, iters: 214, time: 0.096, data: 0.002) G_GAN: 1.036 G_L1: 4.827 D_real: 0.296 D_fake: 0.605
(epoch: 19, iters: 314, time: 0.302, data: 0.002) G_GAN: 0.604 G_L1: 2.776 D_real: 0.747 D_fake: 0.665
(epoch: 19, iters: 414, time: 0.094, data: 0.002) G_GAN: 1.134 G_L1: 13.114 D_real: 0.044 D_fake: 0.591
saving the model at the end of epoch 19, iters 8113
End of epoch 19 / 21 Time Taken: 29 sec
learning rate 0.0020000 -> 0.0010000
(epoch: 20, iters: 87, time: 0.079, data: 0.002) G_GAN: 0.766 G_L1: 3.552 D_real: 0.679 D_fake: 0.664
(epoch: 20, iters: 187, time: 0.094, data: 0.002) G_GAN: 0.753 G_L1: 2.587 D_real: 0.617 D_fake: 0.657
(epoch: 20, iters: 287, time: 0.281, data: 0.002) G_GAN: 0.835 G_L1: 5.238 D_real: 0.179 D_fake: 0.807
(epoch: 20, iters: 387, time: 0.092, data: 0.002) G_GAN: 0.806 G_L1: 3.854 D_real: 0.622 D_fake: 0.597
saving the model at the end of epoch 20, iters 8540
End of epoch 20 / 21 Time Taken: 26 sec
learning rate 0.0010000 -> 0.0000000
(epoch: 21, iters: 60, time: 0.081, data: 0.002) G_GAN: 1.085 G_L1: 2.729 D_real: 0.832 D_fake: 0.426
(epoch: 21, iters: 160, time: 0.096, data: 0.002) G_GAN: 1.188 G_L1: 6.906 D_real: 0.538 D_fake: 0.395
(epoch: 21, iters: 260, time: 0.304, data: 0.002) G_GAN: 0.840 G_L1: 3.016 D_real: 0.782 D_fake: 0.592
(epoch: 21, iters: 360, time: 0.095, data: 0.014) G_GAN: 0.890 G_L1: 6.335 D_real: 0.433 D_fake: 0.604
saving the model at the end of epoch 21, iters 8967
End of epoch 21 / 21 Time Taken: 28 sec
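The learning-rate lines above stay at 0.0020000 through the end of epoch 19; the rate printed before epoch 20 drops to 0.0010000 and the one printed before epoch 21 reaches 0. That is what lr_policy: linear with n_epochs: 20, n_epochs_decay: 1 and epoch_count: 1 prescribes: hold the base rate for n_epochs epochs, then decay linearly to zero over the remaining n_epochs_decay. A small sketch of the multiplier rule, reconstructed from the options and the printed rates (treat the exact formula as an assumption about the repo's scheduler):

    # Multiplier applied to the base lr (0.002) by the "linear" policy;
    # `epoch` is the scheduler's 0-based counter.
    n_epochs, n_epochs_decay, epoch_count = 20, 1, 1

    def linear_lr_lambda(epoch: int) -> float:
        # 1.0 while epoch + epoch_count <= n_epochs, then a linear ramp to 0
        # over n_epochs_decay + 1 steps.
        return 1.0 - max(0, epoch + epoch_count - n_epochs) / float(n_epochs_decay + 1)

    # Twenty values of 1.0, then 0.5, then 0.0,
    # i.e. 0.002 -> ... -> 0.002 -> 0.001 -> 0.000 as printed in the log.
    print([linear_lr_lambda(e) for e in range(22)])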