+ VENV=tensorflow_p36_13rc1
+ git branch
+ grep '*'
+ awk '{print $2}'
+ git log
+ head -1
++ basename ./no_batch_train_1node_16xl_convergence.sh
+ cp no_batch_train_1node_16xl_convergence.sh /home/ubuntu/logs/train_log_20190308_185758
+ env
+ pip freeze
You are using pip version 19.0.2, however version 19.0.3 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
+ ldd /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/libtensorflow_framework.so
+ HOROVOD_TIMELINE=/home/ubuntu/logs/train_log_20190308_185758/htimeline.json
+ HOROVOD_CYCLE_TIME=0.5
+ HOROVOD_FUSION_THRESHOLD=67108864
+ /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/bin/mpirun -np 8 -H localhost:8 -wdir /home/ubuntu/tensorpack-mask-rcnn --mca plm_rsh_no_tree_spawn 1 -bind-to none -map-by slot -mca pml ob1 -mca btl '^openib' -mca btl_tcp_if_exclude lo,docker0 -mca btl_vader_single_copy_mechanism none -x 'NCCL_SOCKET_IFNAME=^docker0,lo' -x NCCL_MIN_NRINGS=8 -x NCCL_DEBUG=INFO -x LD_LIBRARY_PATH -x PATH -x HOROVOD_CYCLE_TIME -x HOROVOD_FUSION_THRESHOLD --output-filename /home/ubuntu/logs/train_log_20190308_185758/mpirun_logs /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/bin/python3 -m MaskRCNN.train --logdir /home/ubuntu/logs/train_log_20190308_185758 --fp16 --perf --images_per_step 16 --throughput_log_freq 2000 --summary_period 0 --config MODE_MASK=True MODE_FPN=True DATA.BASEDIR=/home/ubuntu/data 'DATA.TRAIN=["train2017"]' 'DATA.VAL=("val2017",)' TRAIN.STEPS_PER_EPOCH=15000 'TRAIN.LR_SCHEDULE=[120000, 160000, 180000]' BACKBONE.WEIGHTS=/home/ubuntu/data/pretrained-models/ImageNet-R50-AlignPadding.npz BACKBONE.NORM=FreezeBN TRAIN.BATCH_SIZE_PER_GPU=2 TRAINER=horovod
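A quick sanity-check sketch of how the launch parameters above relate to one another; the numbers are copied from the command line, and reading HOROVOD_FUSION_THRESHOLD as 64 MiB is an inference rather than something the log states:

# Sketch: relationships between the launch parameters above (values copied from the command).
num_ranks = 8            # mpirun -np 8 / -H localhost:8, one process per GPU
batch_per_gpu = 2        # TRAIN.BATCH_SIZE_PER_GPU=2
images_per_step = 16     # --images_per_step 16

assert num_ranks * batch_per_gpu == images_per_step    # global batch per training step
assert 67108864 == 64 * 1024 * 1024                     # HOROVOD_FUSION_THRESHOLD = 64 MiB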
Limited tf.compat.v2.summary API due to missing TensorBoard installation
WARNING: The TensorFlow contrib module will not be included in TensorFlow 2.0.
For more information, please see:
* https://github.com/tensorflow/community/blob/master/rfcs/20180907-contrib-sunset.md
* https://github.com/tensorflow/addons
If you depend on functionality not listed there, please file an issue.
Imported ujson
[0308 18:58:01 @train.py:550] Horovod Rank=2, Size=8
[0308 18:58:01 @config.py:247] WRN It's not recommended to use horovod for single-machine training. Replicated trainer is more stable and has the same efficiency.
[0308 18:58:01 @train.py:550] Horovod Rank=0, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=6, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=5, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=1, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=3, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=7, Size=8
[0308 18:58:01 @train.py:550] Horovod Rank=4, Size=8
[0308 18:58:01 @logger.py:87] Argv: /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/train.py --logdir /home/ubuntu/logs/train_log_20190308_185758 --fp16 --perf --images_per_step 16 --throughput_log_freq 2000 --summary_period 0 --config MODE_MASK=True MODE_FPN=True DATA.BASEDIR=/home/ubuntu/data DATA.TRAIN=["train2017"] DATA.VAL=("val2017",) TRAIN.STEPS_PER_EPOCH=15000 TRAIN.LR_SCHEDULE=[120000, 160000, 180000] BACKBONE.WEIGHTS=/home/ubuntu/data/pretrained-models/ImageNet-R50-AlignPadding.npz BACKBONE.NORM=FreezeBN TRAIN.BATCH_SIZE_PER_GPU=2 TRAINER=horovod
[0308 18:58:01 @config.py:268] Config: ------------------------------------------
{'BACKBONE': {'FREEZE_AFFINE': False,
              'FREEZE_AT': 2,
              'NORM': 'FreezeBN',
              'RESNET_NUM_BLOCKS': [3, 4, 6, 3],
              'STRIDE_1X1': False,
              'TF_PAD_MODE': False,
              'WEIGHTS': '/home/ubuntu/data/pretrained-models/ImageNet-R50-AlignPadding.npz'},
 'DATA': {'BASEDIR': '/home/ubuntu/data',
          'CLASS_NAMES': ['BG', 'person', 'bicycle', 'car', 'motorcycle', 'airplane', 'bus',
                          'train', 'truck', 'boat', 'traffic light', 'fire hydrant', 'stop sign',
                          'parking meter', 'bench', 'bird', 'cat', 'dog', 'horse', 'sheep', 'cow',
                          'elephant', 'bear', 'zebra', 'giraffe', 'backpack', 'umbrella', 'handbag',
                          'tie', 'suitcase', 'frisbee', 'skis', 'snowboard', 'sports ball', 'kite',
                          'baseball bat', 'baseball glove', 'skateboard', 'surfboard',
                          'tennis racket', 'bottle', 'wine glass', 'cup', 'fork', 'knife', 'spoon',
                          'bowl', 'banana', 'apple', 'sandwich', 'orange', 'broccoli', 'carrot',
                          'hot dog', 'pizza', 'donut', 'cake', 'chair', 'couch', 'potted plant',
                          'bed', 'dining table', 'toilet', 'tv', 'laptop', 'mouse', 'remote',
                          'keyboard', 'cell phone', 'microwave', 'oven', 'toaster', 'sink',
                          'refrigerator', 'book', 'clock', 'vase', 'scissors', 'teddy bear',
                          'hair drier', 'toothbrush'],
          'NUM_CATEGORY': 80,
          'NUM_CLASS': 81,
          'TRAIN': ['train2017'],
          'VAL': ('val2017',)},
 'FPN': {'ANCHOR_STRIDES': (4, 8, 16, 32, 64),
         'FRCNN_CONV_HEAD_DIM': 256,
         'FRCNN_FC_HEAD_DIM': 1024,
         'FRCNN_HEAD_FUNC': 'fastrcnn_2fc_head',
         'MRCNN_HEAD_FUNC': 'maskrcnn_up4conv_head',
         'NORM': 'None',
         'NUM_CHANNEL': 256,
         'PROPOSAL_MODE': 'Level',
         'RESOLUTION_REQUIREMENT': 32},
 'FRCNN': {'BATCH_PER_IM': 512,
           'BBOX_REG_WEIGHTS': [10.0, 10.0, 5.0, 5.0],
           'FG_RATIO': 0.25,
           'FG_THRESH': 0.5},
 'MODE_FPN': True,
 'MODE_MASK': True,
 'MRCNN': {'HEAD_DIM': 256},
 'PREPROC': {'MAX_SIZE': 1344.0,
             'PIXEL_MEAN': [123.675, 116.28, 103.53],
             'PIXEL_STD': [58.395, 57.12, 57.375],
             'TEST_SHORT_EDGE_SIZE': 800,
             'TRAIN_SHORT_EDGE_SIZE': [800, 800]},
 'RPN': {'ANCHOR_RATIOS': (0.5, 1.0, 2.0),
         'ANCHOR_SIZES': (32, 64, 128, 256, 512),
         'ANCHOR_STRIDE': 16,
         'BATCH_PER_IM': 256,
         'CROWD_OVERLAP_THRESH': 9.99,
         'FG_RATIO': 0.5,
         'HEAD_DIM': 1024,
         'MIN_SIZE': 0.1,
         'NEGATIVE_ANCHOR_THRESH': 0.3,
         'NUM_ANCHOR': 15,
         'POSITIVE_ANCHOR_THRESH': 0.7,
         'PROPOSAL_NMS_THRESH': 0.7,
         'TEST_PER_LEVEL_NMS_TOPK': 1000,
         'TEST_POST_NMS_TOPK': 1000,
         'TEST_PRE_NMS_TOPK': 6000,
         'TRAIN_PER_LEVEL_NMS_TOPK': 2000,
         'TRAIN_POST_NMS_TOPK': 2000,
         'TRAIN_PRE_NMS_TOPK': 12000},
 'TEST': {'FRCNN_NMS_THRESH': 0.5,
          'RESULTS_PER_IM': 100,
          'RESULT_SCORE_THRESH': 0.05,
          'RESULT_SCORE_THRESH_VIS': 0.3},
 'TRAIN': {'BASE_LR': 0.01,
           'BATCH_SIZE_PER_GPU': 2,
           'EVAL_PERIOD': 25,
           'LR_SCHEDULE': [120000, 160000, 180000],
           'NUM_GPUS': 8,
           'STARTING_EPOCH': 1,
           'STEPS_PER_EPOCH': 15000,
           'WARMUP': 1000,
           'WARMUP_INIT_LR': 0.0033000000000000004,
           'WEIGHT_DECAY': 0.0001},
 'TRAINER': 'horovod'}
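For context, the '--config KEY=VALUE' pairs on the mpirun command line are what produce the nested configuration dumped above. The sketch below is an illustration of that idea only, not the parser used by tensorpack-mask-rcnn; apply_overrides is a hypothetical helper:

# Illustration only (not the repo's parser): folding 'KEY.SUBKEY=VALUE' overrides into a nested dict.
import ast

def apply_overrides(cfg, overrides):
    for kv in overrides:
        key, value = kv.split('=', 1)
        *path, leaf = key.split('.')
        node = cfg
        for part in path:
            node = node.setdefault(part, {})
        try:
            node[leaf] = ast.literal_eval(value)   # lists, tuples, numbers, booleans
        except (ValueError, SyntaxError):
            node[leaf] = value                     # bare strings such as 'FreezeBN'
    return cfg

cfg = apply_overrides({}, ['MODE_MASK=True', 'TRAIN.BATCH_SIZE_PER_GPU=2', 'BACKBONE.NORM=FreezeBN'])
print(cfg)   # {'MODE_MASK': True, 'TRAIN': {'BATCH_SIZE_PER_GPU': 2}, 'BACKBONE': {'NORM': 'FreezeBN'}}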
Batch size per GPU: 2
[0308 18:58:01 @train.py:571] Warm Up Schedule (steps, value): [(0, 0.0033000000000000004), (1000, 0.01)]
[0308 18:58:01 @train.py:572] LR Schedule (epochs, value): [(0, 0.01), (8.0, 0.001), (10.0, 0.00010000000000000002)]
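The warm-up and LR schedules just logged follow directly from the TRAIN config above. Below is a small reconstruction sketch, not the repo's code, assuming the warm-up ramps linearly from WARMUP_INIT_LR (0.33 x BASE_LR) to BASE_LR over WARMUP steps and that the last LR_SCHEDULE boundary (180000) marks the end of training rather than a further decay point:

# Sketch: reconstructing the logged schedules from the config values (not the repo's code).
BASE_LR = 0.01
WARMUP = 1000
WARMUP_INIT_LR = 0.33 * BASE_LR            # 0.0033000000000000004, as logged
LR_SCHEDULE = [120000, 160000, 180000]
STEPS_PER_EPOCH = 15000

warmup_schedule = [(0, WARMUP_INIT_LR), (WARMUP, BASE_LR)]             # (steps, value)
lr_schedule = [(0, BASE_LR)] + [
    (boundary // STEPS_PER_EPOCH, BASE_LR * 0.1 ** (i + 1))            # (epochs, value)
    for i, boundary in enumerate(LR_SCHEDULE[:-1])
]
print(warmup_schedule)   # [(0, 0.0033000000000000004), (1000, 0.01)]
print(lr_schedule)       # [(0, 0.01), (8, 0.001), (10, 0.0001)], up to float repr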
In train dataflow
loading annotations into memory...
Done (t=13.34s)
creating index...
Done (t=13.60s)
creating index...
Done (t=13.64s)
creating index...
Done (t=13.72s)
creating index...
Done (t=13.78s)
creating index...
index created!
[0308 18:58:15 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
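The "loading annotations into memory... / Done / creating index... / index created!" lines are the standard console output of the pycocotools COCO constructor, which each of the eight workers runs while building its view of the training data. A minimal sketch, assuming pycocotools is installed and using the annotation path from the log:

# Minimal sketch of the annotation-loading step behind the log lines above.
from pycocotools.coco import COCO

# Prints "loading annotations into memory...", "Done (t=..s)",
# "creating index..." and "index created!", as seen above.
coco = COCO('/home/ubuntu/data/annotations/instances_train2017.json')
print(len(coco.getImgIds()))   # 118287 images in train2017, matching the progress totals below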
index created!
[0308 18:58:16 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
Done (t=14.41s)
creating index...
index created!
[0308 18:58:16 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
index created!
[0308 18:58:16 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
index created!
[0308 18:58:16 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
Done (t=14.53s)
creating index...
index created!
[0308 18:58:17 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
index created!
[0308 18:58:17 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
Done (t=18.11s)
creating index...
index created!
[0308 18:58:21 @dataset.py:50] Instances loaded from /home/ubuntu/data/annotations/instances_train2017.json.
[... interleaved tqdm progress bars from the 8 workers, each iterating over the 118287 train2017 images at roughly 3,800-4,900 it/s, omitted ...]
19%|█▊ | 22029/118287 [00:05<00:25, 3779.08it/s] | |
25%|██▍ | 29476/118287 [00:06<00:19, 4644.44it/s] | |
24%|██▍ | 28667/118287 [00:06<00:21, 4174.07it/s] | |
5%|▍ | 5738/118287 [00:01<00:33, 3338.34it/s] | |
24%|██▍ | 28509/118287 [00:06<00:22, 3996.11it/s] | |
23%|██▎ | 27086/118287 [00:06<00:21, 4254.51it/s] | |
19%|█▊ | 22024/118287 [00:05<00:25, 3736.54it/s] | |
25%|██▌ | 29690/118287 [00:06<00:19, 4532.12it/s] | |
19%|█▉ | 22409/118287 [00:05<00:25, 3782.80it/s] | |
25%|██▌ | 29942/118287 [00:06<00:19, 4638.09it/s] | |
25%|██▍ | 29100/118287 [00:06<00:21, 4218.90it/s] | |
5%|▌ | 6128/118287 [00:01<00:32, 3486.76it/s] | |
24%|██▍ | 28913/118287 [00:06<00:22, 4009.03it/s] | |
23%|██▎ | 27512/118287 [00:06<00:21, 4231.03it/s] | |
19%|█▉ | 22410/118287 [00:05<00:25, 3772.17it/s] | |
25%|██▌ | 30144/118287 [00:06<00:19, 4455.38it/s] | |
19%|█▉ | 22788/118287 [00:05<00:25, 3784.28it/s] | |
26%|██▌ | 30407/118287 [00:06<00:19, 4605.03it/s] | |
5%|▌ | 6504/118287 [00:01<00:31, 3564.24it/s] | |
25%|██▍ | 29523/118287 [00:06<00:21, 4174.94it/s] | |
25%|██▍ | 29315/118287 [00:06<00:22, 3991.81it/s] | |
24%|██▎ | 27936/118287 [00:06<00:21, 4218.31it/s] | |
19%|█▉ | 22798/118287 [00:05<00:25, 3800.10it/s] | |
26%|██▌ | 30633/118287 [00:06<00:19, 4575.96it/s] | |
20%|█▉ | 23167/118287 [00:05<00:25, 3777.59it/s] | |
26%|██▌ | 30905/118287 [00:06<00:18, 4700.39it/s] | |
6%|▌ | 6864/118287 [00:01<00:31, 3571.53it/s] | |
25%|██▌ | 29947/118287 [00:07<00:21, 4191.62it/s] | |
25%|██▌ | 29715/118287 [00:06<00:22, 3958.16it/s] | |
24%|██▍ | 28359/118287 [00:06<00:21, 4204.61it/s] | |
20%|█▉ | 23185/118287 [00:06<00:24, 3819.49it/s] | |
26%|██▋ | 31092/118287 [00:06<00:19, 4557.06it/s] | |
20%|█▉ | 23545/118287 [00:05<00:25, 3764.50it/s] | |
27%|██▋ | 31376/118287 [00:06<00:18, 4649.85it/s] | |
6%|▌ | 7223/118287 [00:02<00:31, 3526.49it/s] | |
26%|██▌ | 30367/118287 [00:07<00:21, 4162.19it/s] | |
25%|██▌ | 30112/118287 [00:06<00:22, 3858.25it/s] | |
24%|██▍ | 28801/118287 [00:06<00:20, 4263.40it/s] | |
20%|█▉ | 23568/118287 [00:06<00:24, 3797.31it/s] | |
27%|██▋ | 31553/118287 [00:06<00:18, 4568.18it/s] | |
20%|██ | 23922/118287 [00:06<00:25, 3728.69it/s] | |
27%|██▋ | 31842/118287 [00:06<00:18, 4622.21it/s] | |
26%|██▌ | 30802/118287 [00:07<00:20, 4216.65it/s] | |
6%|▋ | 7577/118287 [00:02<00:31, 3473.52it/s] | |
26%|██▌ | 30541/118287 [00:06<00:22, 3976.67it/s] | |
25%|██▍ | 29228/118287 [00:06<00:21, 4240.02it/s] | |
20%|██ | 23948/118287 [00:06<00:24, 3774.39it/s] | |
21%|██ | 24320/118287 [00:06<00:24, 3800.19it/s] | |
27%|██▋ | 32011/118287 [00:07<00:19, 4461.00it/s] | |
27%|██▋ | 32305/118287 [00:06<00:18, 4574.39it/s] | |
26%|██▋ | 31224/118287 [00:07<00:20, 4204.81it/s] | |
7%|▋ | 7926/118287 [00:02<00:31, 3451.33it/s] | |
26%|██▌ | 30950/118287 [00:06<00:21, 4005.71it/s] | |
25%|██▌ | 29653/118287 [00:07<00:21, 4138.84it/s] | |
21%|██ | 24355/118287 [00:06<00:24, 3857.65it/s] | |
21%|██ | 24717/118287 [00:06<00:24, 3849.22it/s] | |
27%|██▋ | 32459/118287 [00:07<00:19, 4465.68it/s] | |
28%|██▊ | 32763/118287 [00:07<00:18, 4562.88it/s] | |
27%|██▋ | 31659/118287 [00:07<00:20, 4245.66it/s] | |
7%|▋ | 8272/118287 [00:02<00:32, 3437.51it/s] | |
27%|██▋ | 31352/118287 [00:07<00:21, 3972.66it/s] | |
25%|██▌ | 30068/118287 [00:07<00:21, 4087.49it/s] | |
21%|██ | 24763/118287 [00:06<00:23, 3920.02it/s] | |
21%|██ | 25103/118287 [00:06<00:24, 3837.99it/s] | |
28%|██▊ | 32907/118287 [00:07<00:19, 4451.04it/s] | |
28%|██▊ | 33255/118287 [00:07<00:18, 4662.88it/s] | |
27%|██▋ | 32084/118287 [00:07<00:20, 4158.15it/s] | |
7%|▋ | 8617/118287 [00:02<00:32, 3393.99it/s] | |
27%|██▋ | 31755/118287 [00:07<00:21, 3989.69it/s] | |
26%|██▌ | 30508/118287 [00:07<00:21, 4174.84it/s] | |
21%|██▏ | 25156/118287 [00:06<00:24, 3843.74it/s] | |
22%|██▏ | 25493/118287 [00:06<00:24, 3855.21it/s] | |
28%|██▊ | 33403/118287 [00:07<00:18, 4589.42it/s] | |
29%|██▊ | 33732/118287 [00:07<00:18, 4694.07it/s] | |
27%|██▋ | 32501/118287 [00:07<00:20, 4135.73it/s] | |
8%|▊ | 8963/118287 [00:02<00:32, 3411.58it/s] | |
27%|██▋ | 32155/118287 [00:07<00:22, 3906.86it/s] | |
26%|██▌ | 30927/118287 [00:07<00:20, 4176.91it/s] | |
22%|██▏ | 25542/118287 [00:06<00:24, 3806.56it/s] | |
22%|██▏ | 25879/118287 [00:06<00:24, 3821.90it/s] | |
29%|██▊ | 33874/118287 [00:07<00:18, 4621.01it/s] | |
29%|██▉ | 34227/118287 [00:07<00:17, 4766.33it/s] | |
28%|██▊ | 32916/118287 [00:07<00:20, 4086.57it/s] | |
8%|▊ | 9312/118287 [00:02<00:31, 3431.77it/s] | |
28%|██▊ | 32551/118287 [00:07<00:21, 3918.31it/s] | |
26%|██▋ | 31346/118287 [00:07<00:20, 4146.45it/s] | |
22%|██▏ | 25924/118287 [00:06<00:24, 3808.72it/s] | |
22%|██▏ | 26299/118287 [00:06<00:23, 3926.61it/s] | |
29%|██▉ | 34349/118287 [00:07<00:18, 4656.85it/s] | |
29%|██▉ | 34746/118287 [00:07<00:17, 4885.19it/s] | |
28%|██▊ | 33383/118287 [00:07<00:19, 4245.22it/s] | |
8%|▊ | 9715/118287 [00:02<00:30, 3590.07it/s] | |
28%|██▊ | 32944/118287 [00:07<00:21, 3907.85it/s] | |
27%|██▋ | 31770/118287 [00:07<00:20, 4168.74it/s] | |
22%|██▏ | 26330/118287 [00:06<00:23, 3880.73it/s] | |
29%|██▉ | 34850/118287 [00:07<00:17, 4754.83it/s] | |
23%|██▎ | 26693/118287 [00:06<00:23, 3891.08it/s] | |
30%|██▉ | 35236/118287 [00:07<00:17, 4821.75it/s] | |
29%|██▊ | 33821/118287 [00:07<00:19, 4283.28it/s] | |
9%|▊ | 10077/118287 [00:02<00:30, 3520.71it/s] | |
28%|██▊ | 33385/118287 [00:07<00:21, 4038.99it/s] | |
27%|██▋ | 32188/118287 [00:07<00:21, 4090.52it/s] | |
23%|██▎ | 26724/118287 [00:06<00:23, 3897.47it/s] | |
23%|██▎ | 27096/118287 [00:06<00:23, 3921.09it/s] | |
30%|██▉ | 35327/118287 [00:07<00:17, 4709.32it/s] | |
30%|███ | 35720/118287 [00:07<00:17, 4822.48it/s] | |
29%|██▉ | 34275/118287 [00:08<00:19, 4354.38it/s] | |
9%|▉ | 10432/118287 [00:02<00:30, 3527.47it/s] | |
29%|██▊ | 33791/118287 [00:07<00:20, 4042.22it/s] | |
28%|██▊ | 32598/118287 [00:07<00:20, 4089.91it/s] | |
23%|██▎ | 27120/118287 [00:07<00:23, 3914.70it/s] | |
23%|██▎ | 27490/118287 [00:06<00:23, 3922.69it/s] | |
30%|███ | 35811/118287 [00:07<00:17, 4747.51it/s] | |
31%|███ | 36219/118287 [00:07<00:16, 4868.91it/s] | |
29%|██▉ | 34749/118287 [00:08<00:18, 4458.07it/s] | |
9%|▉ | 10786/118287 [00:03<00:30, 3522.57it/s] | |
29%|██▉ | 34219/118287 [00:07<00:20, 4109.97it/s] | |
28%|██▊ | 33008/118287 [00:07<00:20, 4081.41it/s] | |
23%|██▎ | 27525/118287 [00:07<00:22, 3951.74it/s] | |
31%|███ | 36287/118287 [00:07<00:17, 4717.73it/s] | |
24%|██▎ | 27883/118287 [00:07<00:23, 3890.27it/s] | |
31%|███ | 36707/118287 [00:07<00:17, 4768.22it/s] | |
30%|██▉ | 35197/118287 [00:08<00:19, 4358.70it/s] | |
9%|▉ | 11140/118287 [00:03<00:30, 3470.68it/s] | |
29%|██▉ | 34669/118287 [00:07<00:19, 4217.82it/s] | |
28%|██▊ | 33463/118287 [00:07<00:20, 4210.83it/s] | |
24%|██▎ | 27921/118287 [00:07<00:23, 3912.06it/s] | |
24%|██▍ | 28273/118287 [00:07<00:23, 3875.46it/s] | |
31%|███ | 36760/118287 [00:08<00:17, 4642.33it/s] | |
31%|███▏ | 37185/118287 [00:07<00:17, 4658.50it/s] | |
30%|███ | 35635/118287 [00:08<00:19, 4337.70it/s] | |
10%|▉ | 11488/118287 [00:03<00:31, 3436.83it/s] | |
30%|██▉ | 35093/118287 [00:07<00:20, 4130.32it/s] | |
29%|██▊ | 33901/118287 [00:08<00:19, 4255.18it/s] | |
24%|██▍ | 28314/118287 [00:07<00:22, 3915.26it/s] | |
24%|██▍ | 28666/118287 [00:07<00:23, 3884.65it/s] | |
31%|███▏ | 37225/118287 [00:08<00:17, 4557.24it/s] | |
32%|███▏ | 37653/118287 [00:08<00:17, 4611.46it/s] | |
30%|███ | 36070/118287 [00:08<00:19, 4307.16it/s] | |
10%|█ | 11867/118287 [00:03<00:30, 3530.22it/s] | |
30%|███ | 35508/118287 [00:08<00:20, 4111.77it/s] | |
29%|██▉ | 34349/118287 [00:08<00:19, 4315.90it/s] | |
24%|██▍ | 28720/118287 [00:07<00:22, 3956.68it/s] | |
25%|██▍ | 29055/118287 [00:07<00:23, 3867.30it/s] | |
32%|███▏ | 37682/118287 [00:08<00:17, 4511.23it/s] | |
32%|███▏ | 38116/118287 [00:08<00:17, 4606.37it/s] | |
10%|█ | 12252/118287 [00:03<00:29, 3615.37it/s] | |
31%|███ | 36502/118287 [00:08<00:19, 4244.07it/s] | |
30%|███ | 35946/118287 [00:08<00:19, 4187.65it/s] | |
29%|██▉ | 34811/118287 [00:08<00:18, 4402.09it/s] | |
25%|██▍ | 29118/118287 [00:07<00:22, 3957.92it/s] | |
25%|██▍ | 29442/118287 [00:07<00:23, 3859.51it/s] | |
32%|███▏ | 38134/118287 [00:08<00:17, 4505.13it/s] | |
33%|███▎ | 38619/118287 [00:08<00:16, 4723.28it/s] | |
11%|█ | 12617/118287 [00:03<00:29, 3625.22it/s] | |
31%|███ | 36928/118287 [00:08<00:19, 4087.18it/s] | |
31%|███ | 36366/118287 [00:08<00:19, 4116.52it/s] | |
30%|██▉ | 35253/118287 [00:08<00:18, 4385.26it/s] | |
25%|██▍ | 29514/118287 [00:07<00:22, 3884.27it/s] | |
25%|██▌ | 29829/118287 [00:07<00:23, 3833.62it/s] | |
33%|███▎ | 38626/118287 [00:08<00:17, 4620.82it/s] | |
33%|███▎ | 39093/118287 [00:08<00:17, 4652.70it/s] | |
11%|█ | 12981/118287 [00:03<00:29, 3589.02it/s] | |
32%|███▏ | 37339/118287 [00:08<00:19, 4057.27it/s] | |
31%|███ | 36779/118287 [00:08<00:19, 4083.05it/s] | |
30%|███ | 35694/118287 [00:08<00:18, 4388.96it/s] | |
25%|██▌ | 29903/118287 [00:07<00:22, 3855.44it/s] | |
26%|██▌ | 30213/118287 [00:07<00:23, 3758.97it/s] | |
33%|███▎ | 39090/118287 [00:08<00:17, 4624.85it/s] | |
33%|███▎ | 39578/118287 [00:08<00:16, 4705.18it/s] | |
11%|█▏ | 13341/118287 [00:03<00:29, 3539.73it/s] | |
31%|███ | 36145/118287 [00:08<00:18, 4422.98it/s] | |
31%|███▏ | 37189/118287 [00:08<00:20, 4035.71it/s] | |
32%|███▏ | 37747/118287 [00:08<00:20, 3949.20it/s] | |
26%|██▌ | 30289/118287 [00:07<00:22, 3846.42it/s] | |
26%|██▌ | 30639/118287 [00:07<00:22, 3894.18it/s] | |
33%|███▎ | 39554/118287 [00:08<00:17, 4620.91it/s] | |
34%|███▍ | 40057/118287 [00:08<00:16, 4730.08it/s] | |
12%|█▏ | 13705/118287 [00:03<00:29, 3564.38it/s] | |
32%|███▏ | 37597/118287 [00:08<00:19, 4048.19it/s] | |
32%|███▏ | 38144/118287 [00:09<00:20, 3948.30it/s] | |
31%|███ | 36588/118287 [00:08<00:18, 4316.27it/s] | |
26%|██▌ | 30718/118287 [00:07<00:22, 3962.49it/s] | |
26%|██▌ | 31033/118287 [00:07<00:22, 3907.69it/s] | |
34%|███▍ | 40017/118287 [00:08<00:16, 4615.71it/s] | |
34%|███▍ | 40531/118287 [00:08<00:16, 4664.76it/s] | |
12%|█▏ | 14066/118287 [00:03<00:29, 3574.11it/s] | |
33%|███▎ | 38568/118287 [00:09<00:19, 4020.44it/s] | |
32%|███▏ | 38003/118287 [00:08<00:20, 3963.24it/s] | |
31%|███▏ | 37021/118287 [00:08<00:19, 4226.97it/s] | |
26%|██▋ | 31116/118287 [00:08<00:22, 3915.21it/s] | |
27%|██▋ | 31425/118287 [00:07<00:22, 3889.81it/s] | |
34%|███▍ | 40479/118287 [00:08<00:16, 4582.88it/s] | |
35%|███▍ | 40999/118287 [00:08<00:16, 4641.49it/s] | |
12%|█▏ | 14426/118287 [00:04<00:29, 3580.68it/s] | |
33%|███▎ | 38984/118287 [00:09<00:19, 4057.04it/s] | |
32%|███▏ | 38441/118287 [00:08<00:19, 4072.18it/s] | |
32%|███▏ | 37445/118287 [00:08<00:19, 4222.31it/s] | |
27%|██▋ | 31509/118287 [00:08<00:22, 3906.23it/s] | |
27%|██▋ | 31819/118287 [00:08<00:22, 3898.83it/s] | |
35%|███▍ | 40938/118287 [00:08<00:17, 4507.18it/s] | |
35%|███▌ | 41467/118287 [00:08<00:16, 4648.85it/s] | |
12%|█▏ | 14785/118287 [00:04<00:29, 3525.90it/s] | |
33%|███▎ | 39404/118287 [00:09<00:19, 4079.49it/s] | |
33%|███▎ | 38881/118287 [00:08<00:19, 4164.36it/s] | |
32%|███▏ | 37868/118287 [00:08<00:19, 4155.26it/s] | |
27%|██▋ | 31904/118287 [00:08<00:22, 3916.87it/s] | |
35%|███▌ | 41410/118287 [00:09<00:16, 4561.25it/s] | |
27%|██▋ | 32210/118287 [00:08<00:22, 3785.56it/s] | |
35%|███▌ | 41938/118287 [00:08<00:16, 4660.38it/s] | |
13%|█▎ | 15138/118287 [00:04<00:30, 3425.64it/s] | |
34%|███▎ | 39834/118287 [00:09<00:18, 4141.98it/s] | |
33%|███▎ | 39300/118287 [00:08<00:18, 4170.43it/s] | |
32%|███▏ | 38297/118287 [00:09<00:19, 4192.24it/s] | |
27%|██▋ | 32297/118287 [00:08<00:22, 3839.08it/s] | |
35%|███▌ | 41875/118287 [00:09<00:16, 4582.45it/s] | |
28%|██▊ | 32609/118287 [00:08<00:22, 3842.16it/s] | |
36%|███▌ | 42405/118287 [00:09<00:16, 4659.24it/s] | |
13%|█▎ | 15482/118287 [00:04<00:30, 3388.82it/s] | |
34%|███▍ | 40249/118287 [00:09<00:19, 4092.33it/s] | |
34%|███▎ | 39719/118287 [00:09<00:18, 4148.40it/s] | |
33%|███▎ | 38753/118287 [00:09<00:18, 4294.23it/s] | |
28%|██▊ | 32689/118287 [00:08<00:22, 3859.27it/s] | |
36%|███▌ | 42334/118287 [00:09<00:16, 4547.69it/s] | |
28%|██▊ | 32995/118287 [00:08<00:22, 3794.99it/s] | |
36%|███▋ | 42887/118287 [00:09<00:16, 4704.09it/s] | |
13%|█▎ | 15822/118287 [00:04<00:30, 3376.64it/s] | |
34%|███▍ | 40135/118287 [00:09<00:18, 4131.73it/s] | |
34%|███▍ | 40659/118287 [00:09<00:19, 4019.13it/s] | |
33%|███▎ | 39184/118287 [00:09<00:18, 4296.98it/s] | |
28%|██▊ | 33079/118287 [00:08<00:22, 3869.89it/s] | |
36%|███▌ | 42800/118287 [00:09<00:16, 4563.20it/s] | |
28%|██▊ | 33425/118287 [00:08<00:21, 3932.33it/s] | |
37%|███▋ | 43358/118287 [00:09<00:15, 4698.86it/s] | |
14%|█▎ | 16176/118287 [00:04<00:29, 3422.81it/s] | |
35%|███▍ | 41099/118287 [00:09<00:18, 4125.65it/s] | |
34%|███▍ | 40549/118287 [00:09<00:19, 4073.62it/s] | |
33%|███▎ | 39616/118287 [00:09<00:18, 4295.97it/s] | |
28%|██▊ | 33502/118287 [00:08<00:21, 3970.02it/s] | |
37%|███▋ | 43276/118287 [00:09<00:16, 4620.50it/s] | |
29%|██▊ | 33821/118287 [00:08<00:21, 3932.14it/s] | |
37%|███▋ | 43829/118287 [00:09<00:15, 4681.44it/s] | |
14%|█▍ | 16535/118287 [00:04<00:29, 3470.97it/s] | |
35%|███▍ | 40957/118287 [00:09<00:18, 4073.54it/s] | |
35%|███▌ | 41513/118287 [00:09<00:18, 4091.20it/s] | |
34%|███▍ | 40068/118287 [00:09<00:17, 4358.64it/s] | |
29%|██▊ | 33903/118287 [00:08<00:21, 3981.22it/s] | |
29%|██▉ | 34234/118287 [00:08<00:21, 3983.78it/s] | |
37%|███▋ | 43739/118287 [00:09<00:16, 4586.60it/s] | |
37%|███▋ | 44298/118287 [00:09<00:15, 4656.93it/s] | |
14%|█▍ | 16897/118287 [00:04<00:28, 3513.57it/s] | |
35%|███▍ | 41386/118287 [00:09<00:18, 4133.13it/s] | |
35%|███▌ | 41932/118287 [00:09<00:18, 4120.05it/s] | |
34%|███▍ | 40505/118287 [00:09<00:18, 4311.73it/s] | |
29%|██▉ | 34318/118287 [00:08<00:20, 4030.10it/s] | |
29%|██▉ | 34676/118287 [00:08<00:20, 4098.74it/s] | |
37%|███▋ | 44198/118287 [00:09<00:16, 4556.28it/s] | |
38%|███▊ | 44785/118287 [00:09<00:15, 4716.78it/s] | |
15%|█▍ | 17254/118287 [00:04<00:28, 3527.56it/s] | |
35%|███▌ | 41800/118287 [00:09<00:18, 4114.18it/s] | |
36%|███▌ | 42345/118287 [00:10<00:18, 4084.29it/s] | |
29%|██▉ | 34754/118287 [00:08<00:20, 4116.87it/s] | |
35%|███▍ | 40937/118287 [00:09<00:18, 4249.83it/s] | |
38%|███▊ | 44654/118287 [00:09<00:16, 4548.64it/s] | |
30%|██▉ | 35088/118287 [00:08<00:20, 4013.48it/s] | |
38%|███▊ | 45257/118287 [00:09<00:15, 4689.42it/s] | |
15%|█▍ | 17608/118287 [00:05<00:29, 3464.22it/s] | |
36%|███▌ | 42771/118287 [00:10<00:18, 4131.39it/s] | |
36%|███▌ | 42212/118287 [00:09<00:18, 4061.59it/s] | |
35%|███▍ | 41389/118287 [00:09<00:17, 4327.02it/s] | |
30%|██▉ | 35167/118287 [00:09<00:20, 4053.04it/s] | |
38%|███▊ | 45142/118287 [00:09<00:15, 4642.99it/s] | |
30%|███ | 35494/118287 [00:08<00:20, 4022.01it/s] | |
39%|███▊ | 45727/118287 [00:09<00:15, 4640.33it/s] | |
15%|█▌ | 17959/118287 [00:05<00:28, 3477.27it/s] | |
37%|███▋ | 43185/118287 [00:10<00:18, 4130.46it/s] | |
36%|███▌ | 42621/118287 [00:09<00:18, 4067.89it/s] | |
35%|███▌ | 41823/118287 [00:09<00:17, 4316.95it/s] | |
30%|███ | 35574/118287 [00:09<00:20, 4040.88it/s] | |
30%|███ | 35908/118287 [00:09<00:20, 4056.02it/s] | |
39%|███▊ | 45607/118287 [00:09<00:15, 4562.31it/s] | |
39%|███▉ | 46226/118287 [00:09<00:15, 4738.14it/s] | |
15%|█▌ | 18313/118287 [00:05<00:28, 3493.34it/s] | |
37%|███▋ | 43599/118287 [00:10<00:18, 4120.17it/s] | |
36%|███▋ | 43032/118287 [00:09<00:18, 4079.51it/s] | |
36%|███▌ | 42256/118287 [00:10<00:17, 4306.08it/s] | |
30%|███ | 36003/118287 [00:09<00:20, 4109.10it/s] | |
39%|███▉ | 46078/118287 [00:10<00:15, 4600.39it/s] | |
31%|███ | 36315/118287 [00:09<00:20, 4025.55it/s] | |
40%|███▉ | 46729/118287 [00:09<00:14, 4821.63it/s] | |
16%|█▌ | 18663/118287 [00:05<00:29, 3425.54it/s] | |
37%|███▋ | 43441/118287 [00:10<00:18, 4051.01it/s] | |
37%|███▋ | 44012/118287 [00:10<00:18, 4070.29it/s] | |
36%|███▌ | 42696/118287 [00:10<00:17, 4333.30it/s] | |
31%|███ | 36415/118287 [00:09<00:20, 4032.35it/s] | |
39%|███▉ | 46577/118287 [00:10<00:15, 4709.67it/s] | |
31%|███ | 36719/118287 [00:09<00:20, 3993.49it/s] | |
40%|███▉ | 47213/118287 [00:10<00:14, 4739.48it/s] | |
16%|█▌ | 19013/118287 [00:05<00:28, 3447.06it/s] | |
38%|███▊ | 44427/118287 [00:10<00:18, 4089.03it/s] | |
37%|███▋ | 43847/118287 [00:10<00:18, 4014.32it/s] | |
36%|███▋ | 43144/118287 [00:10<00:17, 4376.15it/s] | |
31%|███ | 36820/118287 [00:09<00:20, 3918.07it/s] | |
40%|███▉ | 47050/118287 [00:10<00:15, 4674.12it/s] | |
31%|███▏ | 37119/118287 [00:09<00:20, 3886.57it/s] | |
40%|████ | 47693/118287 [00:10<00:14, 4756.77it/s] | |
16%|█▋ | 19359/118287 [00:05<00:28, 3433.84it/s] | |
38%|███▊ | 44847/118287 [00:10<00:17, 4120.86it/s] | |
37%|███▋ | 44249/118287 [00:10<00:18, 3978.64it/s] | |
37%|███▋ | 43582/118287 [00:10<00:17, 4366.11it/s] | |
31%|███▏ | 37214/118287 [00:09<00:20, 3889.89it/s] | |
40%|████ | 47519/118287 [00:10<00:15, 4660.21it/s] | |
32%|███▏ | 37510/118287 [00:09<00:20, 3887.15it/s] | |
41%|████ | 48170/118287 [00:10<00:15, 4642.20it/s] | |
17%|█▋ | 19703/118287 [00:05<00:28, 3429.28it/s] | |
38%|███▊ | 45263/118287 [00:10<00:17, 4132.34it/s] | |
38%|███▊ | 44650/118287 [00:10<00:18, 3987.95it/s] | |
37%|███▋ | 44019/118287 [00:10<00:17, 4334.02it/s] | |
32%|███▏ | 37611/118287 [00:09<00:20, 3913.45it/s] | |
41%|████ | 47986/118287 [00:10<00:15, 4616.55it/s] | |
32%|███▏ | 37900/118287 [00:09<00:21, 3791.14it/s] | |
41%|████ | 48663/118287 [00:10<00:14, 4721.03it/s] | |
17%|█▋ | 20049/118287 [00:05<00:28, 3437.67it/s] | |
39%|███▊ | 45677/118287 [00:10<00:17, 4083.13it/s] | |
38%|███▊ | 45079/118287 [00:10<00:17, 4072.75it/s] | |
38%|███▊ | 44463/118287 [00:10<00:16, 4361.59it/s] | |
32%|███▏ | 38004/118287 [00:09<00:20, 3854.42it/s] | |
41%|████ | 48457/118287 [00:10<00:15, 4642.55it/s] | |
42%|████▏ | 49149/118287 [00:10<00:14, 4761.76it/s] | |
32%|███▏ | 38281/118287 [00:09<00:21, 3779.93it/s] | |
17%|█▋ | 20393/118287 [00:05<00:28, 3426.93it/s] | |
39%|███▉ | 46095/118287 [00:10<00:17, 4110.27it/s] | |
38%|███▊ | 44922/118287 [00:10<00:16, 4425.79it/s] | |
38%|███▊ | 45487/118287 [00:10<00:18, 3985.35it/s] | |
32%|███▏ | 38411/118287 [00:09<00:20, 3914.16it/s] | |
41%|████▏ | 48943/118287 [00:10<00:14, 4704.76it/s] | |
33%|███▎ | 38688/118287 [00:09<00:20, 3862.40it/s] | |
42%|████▏ | 49627/118287 [00:10<00:14, 4634.31it/s] | |
18%|█▊ | 20736/118287 [00:05<00:28, 3397.20it/s] | |
39%|███▉ | 46532/118287 [00:11<00:17, 4183.24it/s] | |
38%|███▊ | 45365/118287 [00:10<00:16, 4417.47it/s] | |
39%|███▉ | 45903/118287 [00:10<00:17, 4035.27it/s] | |
33%|███▎ | 38823/118287 [00:09<00:19, 3973.23it/s] | |
42%|████▏ | 49414/118287 [00:10<00:14, 4614.89it/s] | |
33%|███▎ | 39088/118287 [00:09<00:20, 3897.86it/s] | |
42%|████▏ | 50095/118287 [00:10<00:14, 4647.75it/s] | |
18%|█▊ | 21076/118287 [00:06<00:29, 3338.96it/s] | |
40%|███▉ | 46956/118287 [00:11<00:16, 4199.86it/s] | |
39%|███▉ | 46332/118287 [00:10<00:17, 4107.38it/s] | |
39%|███▊ | 45808/118287 [00:10<00:16, 4368.88it/s] | |
33%|███▎ | 39222/118287 [00:10<00:20, 3855.07it/s] | |
42%|████▏ | 49877/118287 [00:10<00:15, 4540.38it/s] | |
33%|███▎ | 39494/118287 [00:10<00:19, 3940.94it/s] | |
43%|████▎ | 50561/118287 [00:10<00:14, 4649.83it/s] | |
18%|█▊ | 21471/118287 [00:06<00:27, 3500.74it/s] | |
40%|████ | 47377/118287 [00:11<00:17, 4162.49it/s] | |
40%|███▉ | 46768/118287 [00:10<00:17, 4177.28it/s] | |
39%|███▉ | 46278/118287 [00:10<00:16, 4462.70it/s] | |
33%|███▎ | 39620/118287 [00:10<00:20, 3888.38it/s] | |
43%|████▎ | 50351/118287 [00:10<00:14, 4595.96it/s] | |
34%|███▎ | 39909/118287 [00:10<00:19, 4000.43it/s] | |
43%|████▎ | 51044/118287 [00:10<00:14, 4699.15it/s] | |
40%|████ | 47794/118287 [00:11<00:16, 4164.12it/s] | |
18%|█▊ | 21824/118287 [00:06<00:27, 3470.75it/s] | |
40%|███▉ | 46752/118287 [00:11<00:15, 4538.46it/s] | |
40%|███▉ | 47187/118287 [00:10<00:17, 4092.57it/s] | |
34%|███▍ | 40021/118287 [00:10<00:19, 3923.87it/s] | |
43%|████▎ | 50812/118287 [00:11<00:14, 4573.55it/s] | |
34%|███▍ | 40310/118287 [00:10<00:19, 3910.98it/s] | |
44%|████▎ | 51538/118287 [00:11<00:14, 4765.43it/s] | |
19%|█▊ | 22173/118287 [00:06<00:27, 3472.57it/s] | |
41%|████ | 48211/118287 [00:11<00:16, 4129.72it/s] | |
40%|███▉ | 47207/118287 [00:11<00:15, 4472.00it/s] | |
40%|████ | 47618/118287 [00:11<00:17, 4154.12it/s] | |
34%|███▍ | 40415/118287 [00:10<00:20, 3880.19it/s] | |
43%|████▎ | 51283/118287 [00:11<00:14, 4612.24it/s] | |
34%|███▍ | 40703/118287 [00:10<00:19, 3889.64it/s] | |
44%|████▍ | 52016/118287 [00:11<00:13, 4734.53it/s] | |
19%|█▉ | 22540/118287 [00:06<00:27, 3524.88it/s] | |
41%|████ | 48661/118287 [00:11<00:16, 4227.59it/s] | |
40%|████ | 47667/118287 [00:11<00:15, 4503.16it/s] | |
41%|████ | 48035/118287 [00:11<00:17, 4086.47it/s] | |
34%|███▍ | 40804/118287 [00:10<00:20, 3826.48it/s] | |
44%|████▎ | 51745/118287 [00:11<00:14, 4599.44it/s] | |
35%|███▍ | 41110/118287 [00:10<00:19, 3941.09it/s] | |
44%|████▍ | 52490/118287 [00:11<00:14, 4652.77it/s] | |
19%|█▉ | 22903/118287 [00:06<00:26, 3554.60it/s] | |
41%|████▏ | 49088/118287 [00:11<00:16, 4239.62it/s] | |
41%|████ | 48453/118287 [00:11<00:16, 4112.68it/s] | |
41%|████ | 48118/118287 [00:11<00:15, 4411.49it/s] | |
35%|███▍ | 41189/118287 [00:10<00:20, 3825.49it/s] | |
44%|████▍ | 52206/118287 [00:11<00:14, 4519.17it/s] | |
35%|███▌ | 41505/118287 [00:10<00:19, 3892.50it/s] | |
45%|████▍ | 52956/118287 [00:11<00:14, 4522.30it/s] | |
20%|█▉ | 23287/118287 [00:06<00:26, 3634.23it/s] | |
42%|████▏ | 49513/118287 [00:11<00:16, 4135.20it/s] | |
41%|████▏ | 48896/118287 [00:11<00:16, 4199.07it/s] | |
41%|████ | 48578/118287 [00:11<00:15, 4462.57it/s] | |
35%|███▌ | 41586/118287 [00:10<00:19, 3866.27it/s] | |
45%|████▍ | 52664/118287 [00:11<00:14, 4534.62it/s] | |
35%|███▌ | 41896/118287 [00:10<00:19, 3896.61it/s] | |
45%|████▌ | 53430/118287 [00:11<00:14, 4584.37it/s] | |
20%|█▉ | 23652/118287 [00:06<00:26, 3574.06it/s] | |
42%|████▏ | 49938/118287 [00:11<00:16, 4168.72it/s] | |
41%|████▏ | 49043/118287 [00:11<00:15, 4508.00it/s] | |
42%|████▏ | 49317/118287 [00:11<00:16, 4131.08it/s] | |
35%|███▌ | 41974/118287 [00:10<00:20, 3785.38it/s] | |
45%|████▍ | 53118/118287 [00:11<00:14, 4453.14it/s] | |
36%|███▌ | 42287/118287 [00:10<00:19, 3850.86it/s] | |
46%|████▌ | 53933/118287 [00:11<00:13, 4703.71it/s] | |
20%|██ | 24026/118287 [00:06<00:26, 3618.13it/s] | |
43%|████▎ | 50372/118287 [00:11<00:16, 4213.89it/s] | |
42%|████▏ | 49495/118287 [00:11<00:15, 4427.71it/s] | |
42%|████▏ | 49731/118287 [00:11<00:17, 4017.39it/s] | |
36%|███▌ | 42354/118287 [00:10<00:20, 3779.67it/s] | |
45%|████▌ | 53594/118287 [00:11<00:14, 4540.46it/s] | |
36%|███▌ | 42683/118287 [00:10<00:19, 3882.71it/s] | |
46%|████▌ | 54405/118287 [00:11<00:13, 4682.79it/s] | |
21%|██ | 24427/118287 [00:06<00:25, 3724.37it/s] | |
43%|████▎ | 50795/118287 [00:12<00:16, 4153.31it/s] | |
42%|████▏ | 49940/118287 [00:11<00:15, 4433.31it/s] | |
42%|████▏ | 50148/118287 [00:11<00:16, 4056.19it/s] | |
36%|███▌ | 42735/118287 [00:11<00:19, 3781.11it/s] | |
46%|████▌ | 54083/118287 [00:11<00:13, 4637.42it/s] | |
36%|███▋ | 43072/118287 [00:10<00:19, 3869.47it/s] | |
46%|████▋ | 54875/118287 [00:11<00:13, 4645.16it/s] | |
21%|██ | 24812/118287 [00:07<00:24, 3759.84it/s] | |
43%|████▎ | 51246/118287 [00:12<00:15, 4252.73it/s] | |
43%|████▎ | 50393/118287 [00:11<00:15, 4461.36it/s] | |
43%|████▎ | 50555/118287 [00:11<00:16, 4052.59it/s] | |
36%|███▋ | 43118/118287 [00:11<00:19, 3795.62it/s] | |
46%|████▌ | 54548/118287 [00:11<00:13, 4636.18it/s] | |
37%|███▋ | 43460/118287 [00:11<00:19, 3838.90it/s] | |
47%|████▋ | 55347/118287 [00:11<00:13, 4665.15it/s] | |
21%|██▏ | 25189/118287 [00:07<00:24, 3733.75it/s] | |
44%|████▎ | 51682/118287 [00:12<00:15, 4283.09it/s] | |
43%|████▎ | 50840/118287 [00:11<00:15, 4446.06it/s] | |
43%|████▎ | 50984/118287 [00:11<00:16, 4113.26it/s] | |
37%|███▋ | 43498/118287 [00:11<00:20, 3729.78it/s] | |
47%|████▋ | 55014/118287 [00:12<00:13, 4639.29it/s] | |
37%|███▋ | 43845/118287 [00:11<00:19, 3833.41it/s] | |
47%|████▋ | 55815/118287 [00:11<00:13, 4621.28it/s] | |
22%|██▏ | 25565/118287 [00:07<00:24, 3740.97it/s] | |
44%|████▍ | 52112/118287 [00:12<00:15, 4224.84it/s] | |
43%|████▎ | 51301/118287 [00:12<00:14, 4492.83it/s] | |
43%|████▎ | 51408/118287 [00:11<00:16, 4146.89it/s] | |
37%|███▋ | 43872/118287 [00:11<00:20, 3703.40it/s] | |
47%|████▋ | 55479/118287 [00:12<00:13, 4600.20it/s] | |
37%|███▋ | 44237/118287 [00:11<00:19, 3858.89it/s] | |
48%|████▊ | 56278/118287 [00:12<00:13, 4544.21it/s] | |
22%|██▏ | 25946/118287 [00:07<00:24, 3759.80it/s] | |
44%|████▍ | 52536/118287 [00:12<00:15, 4218.26it/s] | |
44%|████▍ | 51751/118287 [00:12<00:14, 4454.90it/s] | |
44%|████▍ | 51824/118287 [00:12<00:16, 4102.14it/s] | |
37%|███▋ | 44243/118287 [00:11<00:20, 3700.43it/s] | |
47%|████▋ | 55940/118287 [00:12<00:13, 4557.24it/s] | |
38%|███▊ | 44624/118287 [00:11<00:19, 3818.86it/s] | |
48%|████▊ | 56737/118287 [00:12<00:13, 4557.05it/s] | |
22%|██▏ | 26338/118287 [00:07<00:24, 3803.62it/s] | |
45%|████▍ | 52959/118287 [00:12<00:15, 4096.96it/s] | |
44%|████▍ | 52235/118287 [00:12<00:16, 4070.56it/s] | |
44%|████▍ | 52197/118287 [00:12<00:15, 4300.47it/s] | |
38%|███▊ | 44614/118287 [00:11<00:20, 3667.41it/s] | |
48%|████▊ | 56397/118287 [00:12<00:13, 4500.64it/s] | |
38%|███▊ | 45045/118287 [00:11<00:18, 3926.47it/s] | |
48%|████▊ | 57239/118287 [00:12<00:13, 4685.44it/s] | |
23%|██▎ | 26734/118287 [00:07<00:23, 3846.04it/s] | |
45%|████▌ | 53394/118287 [00:12<00:15, 4168.26it/s] | |
45%|████▍ | 52645/118287 [00:12<00:16, 4073.77it/s] | |
44%|████▍ | 52629/118287 [00:12<00:15, 4250.88it/s] | |
38%|███▊ | 45003/118287 [00:11<00:19, 3728.27it/s] | |
48%|████▊ | 56872/118287 [00:12<00:13, 4571.93it/s] | |
38%|███▊ | 45439/118287 [00:11<00:18, 3868.07it/s] | |
49%|████▉ | 57735/118287 [00:12<00:12, 4760.81it/s] | |
23%|██▎ | 27119/118287 [00:07<00:23, 3826.22it/s] | |
46%|████▌ | 53855/118287 [00:12<00:15, 4291.16it/s] | |
45%|████▍ | 53053/118287 [00:12<00:16, 3965.56it/s] | |
45%|████▍ | 53056/118287 [00:12<00:15, 4104.65it/s] | |
38%|███▊ | 45377/118287 [00:11<00:19, 3708.50it/s] | |
48%|████▊ | 57361/118287 [00:12<00:13, 4661.18it/s] | |
39%|███▊ | 45832/118287 [00:11<00:18, 3886.11it/s] | |
49%|████▉ | 58213/118287 [00:12<00:12, 4709.98it/s] | |
23%|██▎ | 27516/118287 [00:07<00:23, 3867.96it/s] | |
46%|████▌ | 54286/118287 [00:12<00:14, 4282.57it/s] | |
45%|████▌ | 53478/118287 [00:12<00:16, 4040.94it/s] | |
45%|████▌ | 53474/118287 [00:12<00:15, 4126.91it/s] | |
49%|████▉ | 57847/118287 [00:12<00:12, 4718.09it/s] | |
39%|███▊ | 45749/118287 [00:11<00:20, 3622.69it/s] | |
39%|███▉ | 46244/118287 [00:11<00:18, 3950.52it/s] | |
50%|████▉ | 58685/118287 [00:12<00:12, 4684.86it/s] | |
24%|██▎ | 27904/118287 [00:07<00:23, 3806.52it/s] | |
46%|████▋ | 54716/118287 [00:13<00:14, 4280.24it/s] | |
46%|████▌ | 53935/118287 [00:12<00:15, 4185.34it/s] | |
46%|████▌ | 53908/118287 [00:12<00:15, 4188.33it/s] | |
49%|████▉ | 58320/118287 [00:12<00:12, 4706.13it/s] | |
39%|███▉ | 46122/118287 [00:11<00:19, 3653.94it/s] | |
39%|███▉ | 46662/118287 [00:11<00:17, 4016.47it/s] | |
50%|█████ | 59155/118287 [00:12<00:12, 4689.09it/s] | |
24%|██▍ | 28286/118287 [00:07<00:23, 3804.45it/s] | |
47%|████▋ | 55145/118287 [00:13<00:14, 4260.17it/s] | |
46%|████▌ | 54356/118287 [00:12<00:15, 4159.21it/s] | |
46%|████▌ | 54328/118287 [00:12<00:15, 4096.94it/s] | |
39%|███▉ | 46518/118287 [00:12<00:19, 3738.42it/s] | |
50%|████▉ | 58792/118287 [00:12<00:12, 4605.03it/s] | |
40%|███▉ | 47065/118287 [00:11<00:17, 3972.16it/s] | |
50%|█████ | 59625/118287 [00:12<00:12, 4605.32it/s] | |
24%|██▍ | 28679/118287 [00:08<00:23, 3840.61it/s] | |
47%|████▋ | 55574/118287 [00:13<00:14, 4268.90it/s] | |
46%|████▋ | 54784/118287 [00:12<00:15, 4194.68it/s] | |
46%|████▋ | 54739/118287 [00:12<00:15, 4093.73it/s] | |
40%|███▉ | 46893/118287 [00:12<00:19, 3713.07it/s] | |
50%|█████ | 59263/118287 [00:12<00:12, 4633.35it/s] | |
40%|████ | 47463/118287 [00:12<00:17, 3941.04it/s] | |
51%|█████ | 60087/118287 [00:12<00:12, 4498.74it/s] | |
25%|██▍ | 29064/118287 [00:08<00:23, 3840.09it/s] | |
47%|████▋ | 56002/118287 [00:13<00:14, 4213.97it/s] | |
47%|████▋ | 55215/118287 [00:12<00:14, 4224.75it/s] | |
47%|████▋ | 55150/118287 [00:12<00:15, 4074.61it/s] | |
50%|█████ | 59727/118287 [00:13<00:12, 4585.53it/s] | |
40%|███▉ | 47265/118287 [00:12<00:19, 3643.89it/s] | |
40%|████ | 47858/118287 [00:12<00:17, 3926.32it/s] | |
25%|██▍ | 29451/118287 [00:08<00:23, 3848.65it/s] | |
51%|█████ | 60538/118287 [00:12<00:13, 4410.30it/s] | |
48%|████▊ | 56424/118287 [00:13<00:14, 4144.46it/s] | |
47%|████▋ | 55639/118287 [00:12<00:15, 4146.57it/s] | |
47%|████▋ | 55562/118287 [00:13<00:15, 4084.73it/s] | |
40%|████ | 47637/118287 [00:12<00:19, 3662.29it/s] | |
51%|█████ | 60187/118287 [00:13<00:12, 4511.28it/s] | |
41%|████ | 48251/118287 [00:12<00:17, 3923.51it/s] | |
25%|██▌ | 29837/118287 [00:08<00:22, 3848.74it/s] | |
52%|█████▏ | 60993/118287 [00:13<00:12, 4448.18it/s] | |
48%|████▊ | 56861/118287 [00:13<00:14, 4209.34it/s] | |
47%|████▋ | 56056/118287 [00:13<00:14, 4151.69it/s] | |
47%|████▋ | 55971/118287 [00:13<00:15, 4032.01it/s] | |
41%|████ | 48662/118287 [00:12<00:17, 3975.17it/s] | |
41%|████ | 48004/118287 [00:12<00:19, 3587.08it/s] | |
51%|█████▏ | 60639/118287 [00:13<00:13, 4387.37it/s] | |
52%|█████▏ | 61439/118287 [00:13<00:12, 4378.55it/s] | |
26%|██▌ | 30222/118287 [00:08<00:23, 3749.80it/s] | |
48%|████▊ | 57317/118287 [00:13<00:14, 4308.68it/s] | |
48%|████▊ | 56472/118287 [00:13<00:14, 4127.68it/s] | |
48%|████▊ | 56375/118287 [00:13<00:15, 3977.37it/s] | |
41%|████▏ | 49068/118287 [00:12<00:17, 3995.67it/s] | |
41%|████ | 48372/118287 [00:12<00:19, 3613.73it/s] | |
52%|█████▏ | 61091/118287 [00:13<00:12, 4424.97it/s] | |
52%|█████▏ | 61922/118287 [00:13<00:12, 4504.53it/s] | |
26%|██▌ | 30648/118287 [00:08<00:22, 3885.71it/s] | |
49%|████▉ | 57757/118287 [00:13<00:13, 4334.21it/s] | |
48%|████▊ | 56907/118287 [00:13<00:14, 4190.68it/s] | |
48%|████▊ | 56804/118287 [00:13<00:15, 4062.22it/s] | |
41%|████ | 48761/118287 [00:12<00:18, 3692.30it/s] | |
42%|████▏ | 49468/118287 [00:12<00:17, 3902.50it/s] | |
52%|█████▏ | 61535/118287 [00:13<00:12, 4416.43it/s] | |
26%|██▌ | 31047/118287 [00:08<00:22, 3911.88it/s] | |
53%|█████▎ | 62374/118287 [00:13<00:12, 4437.78it/s] | |
49%|████▉ | 58192/118287 [00:13<00:14, 4290.07it/s] | |
48%|████▊ | 57353/118287 [00:13<00:14, 4266.31it/s] | |
48%|████▊ | 57252/118287 [00:13<00:14, 4175.88it/s] | |
42%|████▏ | 49134/118287 [00:12<00:18, 3703.49it/s] | |
52%|█████▏ | 62019/118287 [00:13<00:12, 4533.91it/s] | |
42%|████▏ | 49859/118287 [00:12<00:17, 3870.11it/s] | |
27%|██▋ | 31446/118287 [00:08<00:22, 3933.66it/s] | |
53%|█████▎ | 62850/118287 [00:13<00:12, 4528.32it/s] | |
50%|████▉ | 58628/118287 [00:13<00:13, 4308.07it/s] | |
49%|████▉ | 57804/118287 [00:13<00:13, 4331.91it/s] | |
49%|████▉ | 57697/118287 [00:13<00:14, 4252.45it/s] | |
42%|████▏ | 50247/118287 [00:12<00:17, 3872.91it/s] | |
42%|████▏ | 49505/118287 [00:12<00:19, 3589.23it/s] | |
53%|█████▎ | 62474/118287 [00:13<00:12, 4424.60it/s] | |
54%|█████▎ | 63318/118287 [00:13<00:12, 4569.03it/s] | |
27%|██▋ | 31841/118287 [00:08<00:22, 3880.62it/s] | |
50%|████▉ | 59060/118287 [00:14<00:13, 4262.75it/s] | |
49%|████▉ | 58238/118287 [00:13<00:13, 4306.53it/s] | |
49%|████▉ | 58124/118287 [00:13<00:14, 4199.51it/s] | |
43%|████▎ | 50638/118287 [00:12<00:17, 3880.10it/s] | |
42%|████▏ | 49873/118287 [00:12<00:18, 3611.38it/s] | |
53%|█████▎ | 62960/118287 [00:13<00:12, 4545.96it/s] | |
54%|█████▍ | 63776/118287 [00:13<00:12, 4539.51it/s] | |
27%|██▋ | 32230/118287 [00:08<00:22, 3846.25it/s] | |
50%|█████ | 59487/118287 [00:14<00:13, 4201.76it/s] | |
50%|████▉ | 58670/118287 [00:13<00:13, 4297.18it/s] | |
50%|████▉ | 58559/118287 [00:13<00:14, 4243.05it/s] | |
43%|████▎ | 51041/118287 [00:12<00:17, 3923.28it/s] | |
42%|████▏ | 50250/118287 [00:13<00:18, 3656.13it/s] | |
54%|█████▎ | 63417/118287 [00:13<00:12, 4405.67it/s] | |
54%|█████▍ | 64232/118287 [00:13<00:11, 4544.75it/s] | |
28%|██▊ | 32629/118287 [00:09<00:22, 3887.92it/s] | |
51%|█████ | 59908/118287 [00:14<00:13, 4200.78it/s] | |
50%|████▉ | 59101/118287 [00:13<00:13, 4275.60it/s] | |
50%|████▉ | 58985/118287 [00:13<00:14, 4087.47it/s] | |
44%|████▎ | 51464/118287 [00:13<00:16, 4004.60it/s] | |
43%|████▎ | 50621/118287 [00:13<00:18, 3669.12it/s] | |
54%|█████▍ | 63860/118287 [00:13<00:12, 4404.33it/s] | |
55%|█████▍ | 64721/118287 [00:13<00:11, 4642.21it/s] | |
28%|██▊ | 33019/118287 [00:09<00:21, 3883.09it/s] | |
51%|█████ | 60329/118287 [00:14<00:13, 4145.60it/s] | |
50%|█████ | 59529/118287 [00:13<00:13, 4226.95it/s] | |
50%|█████ | 59398/118287 [00:14<00:14, 4097.68it/s] | |
43%|████▎ | 51008/118287 [00:13<00:18, 3721.95it/s] | |
44%|████▍ | 51866/118287 [00:13<00:16, 3932.67it/s] | |
54%|█████▍ | 64325/118287 [00:14<00:12, 4474.70it/s] | |
28%|██▊ | 33434/118287 [00:09<00:21, 3954.89it/s] | |
55%|█████▌ | 65187/118287 [00:13<00:11, 4612.18it/s] | |
51%|█████▏ | 60744/118287 [00:14<00:14, 4107.71it/s] | |
51%|█████ | 59953/118287 [00:14<00:14, 4166.39it/s] | |
51%|█████ | 59810/118287 [00:14<00:14, 4064.97it/s] | |
43%|████▎ | 51406/118287 [00:13<00:17, 3790.60it/s] | |
44%|████▍ | 52269/118287 [00:13<00:16, 3958.06it/s] | |
55%|█████▍ | 64788/118287 [00:14<00:11, 4519.67it/s] | |
29%|██▊ | 33831/118287 [00:09<00:21, 3959.21it/s] | |
55%|█████▌ | 65649/118287 [00:14<00:11, 4497.52it/s] | |
52%|█████▏ | 61156/118287 [00:14<00:14, 4053.40it/s] | |
51%|█████ | 60371/118287 [00:14<00:14, 4091.94it/s] | |
51%|█████ | 60218/118287 [00:14<00:14, 4028.19it/s] | |
44%|████▍ | 51793/118287 [00:13<00:17, 3806.34it/s] | |
45%|████▍ | 52666/118287 [00:13<00:16, 3930.19it/s] | |
55%|█████▌ | 65242/118287 [00:14<00:11, 4501.08it/s] | |
29%|██▉ | 34246/118287 [00:09<00:20, 4013.09it/s] | |
56%|█████▌ | 66100/118287 [00:14<00:12, 4334.65it/s] | |
52%|█████▏ | 61578/118287 [00:14<00:13, 4101.23it/s] | |
51%|█████▏ | 60787/118287 [00:14<00:13, 4108.85it/s] | |
51%|█████ | 60622/118287 [00:14<00:14, 3887.73it/s] | |
44%|████▍ | 52175/118287 [00:13<00:17, 3734.94it/s] | |
45%|████▍ | 53060/118287 [00:13<00:16, 3857.51it/s] | |
56%|█████▌ | 65693/118287 [00:14<00:11, 4393.31it/s] | |
29%|██▉ | 34697/118287 [00:09<00:20, 4147.04it/s] | |
56%|█████▌ | 66536/118287 [00:14<00:12, 4264.39it/s] | |
52%|█████▏ | 62023/118287 [00:14<00:13, 4199.85it/s] | |
52%|█████▏ | 61199/118287 [00:14<00:14, 4036.29it/s] | |
52%|█████▏ | 61034/118287 [00:14<00:14, 3954.01it/s] | |
44%|████▍ | 52565/118287 [00:13<00:17, 3775.29it/s] | |
45%|████▌ | 53473/118287 [00:13<00:16, 3929.58it/s] | |
56%|█████▌ | 66134/118287 [00:14<00:12, 4306.53it/s] | |
30%|██▉ | 35114/118287 [00:09<00:20, 3978.28it/s] | |
57%|█████▋ | 66978/118287 [00:14<00:12, 4274.58it/s] | |
53%|█████▎ | 62444/118287 [00:14<00:13, 4092.46it/s] | |
52%|█████▏ | 61621/118287 [00:14<00:13, 4087.93it/s] | |
46%|████▌ | 53918/118287 [00:13<00:15, 4071.57it/s] | |
52%|█████▏ | 61431/118287 [00:14<00:14, 3862.02it/s] | |
45%|████▍ | 52944/118287 [00:13<00:17, 3685.30it/s] | |
56%|█████▋ | 66582/118287 [00:14<00:11, 4354.61it/s] | |
30%|███ | 35524/118287 [00:09<00:20, 4011.75it/s] | |
57%|█████▋ | 67407/118287 [00:14<00:12, 4216.48it/s] | |
53%|█████▎ | 62872/118287 [00:14<00:13, 4143.65it/s] | |
52%|█████▏ | 62057/118287 [00:14<00:13, 4154.49it/s] | |
52%|█████▏ | 61848/118287 [00:14<00:14, 3948.53it/s] | |
46%|████▌ | 54327/118287 [00:13<00:15, 4053.61it/s] | |
45%|████▌ | 53332/118287 [00:13<00:17, 3734.42it/s] | |
57%|█████▋ | 67044/118287 [00:14<00:11, 4429.15it/s] | |
30%|███ | 35946/118287 [00:09<00:20, 4071.81it/s] | |
57%|█████▋ | 67830/118287 [00:14<00:12, 4142.44it/s] | |
54%|█████▎ | 63302/118287 [00:15<00:13, 4187.10it/s] | |
53%|█████▎ | 62474/118287 [00:14<00:14, 3971.75it/s] | |
53%|█████▎ | 62248/118287 [00:14<00:14, 3958.84it/s] | |
46%|████▋ | 54743/118287 [00:13<00:15, 4081.64it/s] | |
45%|████▌ | 53730/118287 [00:13<00:16, 3798.59it/s] | |
57%|█████▋ | 67488/118287 [00:14<00:11, 4376.26it/s] | |
31%|███ | 36355/118287 [00:10<00:20, 4021.63it/s] | |
54%|█████▍ | 63722/118287 [00:15<00:13, 4155.77it/s] | |
58%|█████▊ | 68246/118287 [00:14<00:12, 3961.54it/s] | |
53%|█████▎ | 62886/118287 [00:14<00:13, 4014.68it/s] | |
46%|████▌ | 54149/118287 [00:14<00:16, 3904.36it/s] | |
53%|█████▎ | 62645/118287 [00:14<00:14, 3916.82it/s] | |
47%|████▋ | 55153/118287 [00:13<00:15, 4051.03it/s] | |
57%|█████▋ | 67939/118287 [00:14<00:11, 4414.41it/s] | |
31%|███ | 36759/118287 [00:10<00:20, 3952.13it/s] | |
54%|█████▍ | 64139/118287 [00:15<00:13, 4114.99it/s] | |
58%|█████▊ | 68645/118287 [00:14<00:12, 3904.72it/s] | |
54%|█████▎ | 63290/118287 [00:14<00:13, 3992.23it/s] | |
46%|████▌ | 54541/118287 [00:14<00:16, 3880.56it/s] | |
47%|████▋ | 55563/118287 [00:14<00:15, 4060.29it/s] | |
53%|█████▎ | 63058/118287 [00:14<00:13, 3970.82it/s] | |
58%|█████▊ | 68382/118287 [00:15<00:11, 4356.15it/s] | |
31%|███▏ | 37156/118287 [00:10<00:20, 3898.63it/s] | |
55%|█████▍ | 64617/118287 [00:15<00:12, 4286.85it/s] | |
58%|█████▊ | 69056/118287 [00:14<00:12, 3949.93it/s] | |
54%|█████▍ | 63691/118287 [00:14<00:13, 3929.00it/s] | |
46%|████▋ | 54930/118287 [00:14<00:16, 3872.46it/s] | |
54%|█████▎ | 63456/118287 [00:15<00:13, 3948.67it/s] | |
47%|████▋ | 55970/118287 [00:14<00:15, 4005.52it/s] | |
58%|█████▊ | 68819/118287 [00:15<00:11, 4328.76it/s] | |
32%|███▏ | 37547/118287 [00:10<00:20, 3900.92it/s] | |
55%|█████▍ | 65048/118287 [00:15<00:12, 4292.73it/s] | |
59%|█████▊ | 69453/118287 [00:15<00:12, 3859.01it/s] | |
54%|█████▍ | 64085/118287 [00:15<00:14, 3839.24it/s] | |
47%|████▋ | 55318/118287 [00:14<00:16, 3872.00it/s] | |
54%|█████▍ | 63852/118287 [00:15<00:13, 3915.29it/s] | |
48%|████▊ | 56372/118287 [00:14<00:15, 3949.27it/s] | |
59%|█████▊ | 69260/118287 [00:15<00:11, 4347.18it/s] | |
32%|███▏ | 37938/118287 [00:10<00:20, 3839.86it/s] | |
55%|█████▌ | 65479/118287 [00:15<00:12, 4227.71it/s] | |
59%|█████▉ | 69868/118287 [00:15<00:12, 3940.61it/s] | |
55%|█████▍ | 64519/118287 [00:15<00:13, 3975.98it/s] | |
47%|████▋ | 55706/118287 [00:14<00:16, 3834.55it/s] | |
54%|█████▍ | 64273/118287 [00:15<00:13, 3998.54it/s] | |
48%|████▊ | 56784/118287 [00:14<00:15, 3996.90it/s] | |
59%|█████▉ | 69711/118287 [00:15<00:11, 4394.16it/s] | |
32%|███▏ | 38326/118287 [00:10<00:20, 3850.20it/s] | |
56%|█████▌ | 65904/118287 [00:15<00:12, 4122.59it/s] | |
59%|█████▉ | 70321/118287 [00:15<00:11, 4097.67it/s] | |
55%|█████▍ | 64946/118287 [00:15<00:13, 4059.66it/s] | |
55%|█████▍ | 64683/118287 [00:15<00:13, 4026.84it/s] | |
47%|████▋ | 56090/118287 [00:14<00:16, 3797.27it/s] | |
48%|████▊ | 57207/118287 [00:14<00:15, 4060.31it/s] | |
59%|█████▉ | 70191/118287 [00:15<00:10, 4507.03it/s] | |
33%|███▎ | 38724/118287 [00:10<00:20, 3886.07it/s] | |
56%|█████▌ | 66318/118287 [00:15<00:12, 4071.63it/s] | |
60%|█████▉ | 70736/118287 [00:15<00:11, 4112.32it/s] | |
55%|█████▌ | 65354/118287 [00:15<00:13, 4042.16it/s] | |
48%|████▊ | 56471/118287 [00:14<00:16, 3800.64it/s] | |
55%|█████▌ | 65087/118287 [00:15<00:13, 3983.34it/s] | |
49%|████▊ | 57618/118287 [00:14<00:14, 4074.69it/s] | |
60%|█████▉ | 70671/118287 [00:15<00:10, 4590.38it/s] | |
33%|███▎ | 39114/118287 [00:10<00:20, 3882.44it/s] | |
56%|█████▋ | 66741/118287 [00:15<00:12, 4114.38it/s] | |
60%|██████ | 71159/118287 [00:15<00:11, 4141.11it/s] | |
56%|█████▌ | 65760/118287 [00:15<00:13, 3981.12it/s] | |
48%|████▊ | 56869/118287 [00:14<00:15, 3851.89it/s] | |
49%|████▉ | 58026/118287 [00:14<00:14, 4057.89it/s] | |
55%|█████▌ | 65486/118287 [00:15<00:13, 3919.33it/s] | |
60%|██████ | 71150/118287 [00:15<00:10, 4646.71it/s] | |
33%|███▎ | 39514/118287 [00:10<00:20, 3916.61it/s] | |
57%|█████▋ | 67193/118287 [00:15<00:12, 4227.15it/s] | |
61%|██████ | 71575/118287 [00:15<00:11, 4000.97it/s] | |
48%|████▊ | 57277/118287 [00:14<00:15, 3916.85it/s] | |
56%|█████▌ | 66160/118287 [00:15<00:13, 3896.27it/s] | |
49%|████▉ | 58433/118287 [00:14<00:14, 4039.02it/s] | |
56%|█████▌ | 65879/118287 [00:15<00:13, 3850.73it/s] | |
61%|██████ | 71616/118287 [00:15<00:10, 4498.24it/s] | |
34%|███▎ | 39917/118287 [00:10<00:19, 3948.81it/s] | |
57%|█████▋ | 67618/118287 [00:16<00:12, 4081.51it/s] | |
61%|██████ | 71993/118287 [00:15<00:11, 4052.06it/s] | |
49%|████▉ | 57696/118287 [00:15<00:15, 3992.02it/s] | |
56%|█████▋ | 66554/118287 [00:15<00:13, 3908.13it/s] | |
50%|████▉ | 58838/118287 [00:14<00:15, 3908.95it/s] | |
56%|█████▌ | 66265/118287 [00:15<00:13, 3795.80it/s] | |
61%|██████ | 72084/118287 [00:15<00:10, 4550.24it/s] | |
34%|███▍ | 40313/118287 [00:11<00:20, 3855.64it/s] | |
58%|█████▊ | 68029/118287 [00:16<00:12, 4026.91it/s] | |
61%|██████ | 72400/118287 [00:15<00:11, 4004.44it/s] | |
57%|█████▋ | 66959/118287 [00:15<00:12, 3948.85it/s] | |
49%|████▉ | 58096/118287 [00:15<00:15, 3939.90it/s] | |
50%|█████ | 59239/118287 [00:15<00:14, 3938.62it/s] | |
56%|█████▋ | 66666/118287 [00:15<00:13, 3849.41it/s] | |
61%|██████▏ | 72541/118287 [00:15<00:10, 4518.82it/s] | |
34%|███▍ | 40702/118287 [00:11<00:20, 3865.84it/s] | |
62%|██████▏ | 72808/118287 [00:15<00:11, 4026.03it/s] | |
58%|█████▊ | 68434/118287 [00:16<00:12, 3942.08it/s] | |
57%|█████▋ | 67361/118287 [00:15<00:12, 3967.68it/s] | |
49%|████▉ | 58507/118287 [00:15<00:14, 3989.39it/s] | |
57%|█████▋ | 67082/118287 [00:15<00:13, 3937.29it/s] | |
50%|█████ | 59634/118287 [00:15<00:15, 3830.23it/s] | |
62%|██████▏ | 73012/118287 [00:16<00:09, 4573.77it/s] | |
35%|███▍ | 41118/118287 [00:11<00:19, 3947.22it/s] | |
62%|██████▏ | 73241/118287 [00:15<00:10, 4111.90it/s] | |
58%|█████▊ | 68830/118287 [00:16<00:12, 3887.92it/s] | |
57%|█████▋ | 67759/118287 [00:15<00:12, 3940.55it/s] | |
50%|████▉ | 58907/118287 [00:15<00:15, 3853.32it/s] | |
57%|█████▋ | 67477/118287 [00:16<00:13, 3881.85it/s] | |
62%|██████▏ | 73471/118287 [00:16<00:09, 4564.31it/s] | |
51%|█████ | 60019/118287 [00:15<00:15, 3718.89it/s] | |
35%|███▌ | 41517/118287 [00:11<00:19, 3956.16it/s] | |
62%|██████▏ | 73654/118287 [00:16<00:10, 4080.93it/s] | |
59%|█████▊ | 69221/118287 [00:16<00:12, 3868.76it/s] | |
58%|█████▊ | 68154/118287 [00:16<00:12, 3912.44it/s] | |
50%|█████ | 59313/118287 [00:15<00:15, 3910.50it/s] | |
57%|█████▋ | 67886/118287 [00:16<00:12, 3941.95it/s] | |
63%|██████▎ | 73954/118287 [00:16<00:09, 4637.88it/s] | |
51%|█████ | 60393/118287 [00:15<00:15, 3698.05it/s] | |
35%|███▌ | 41924/118287 [00:11<00:19, 3989.59it/s] | |
63%|██████▎ | 74076/118287 [00:16<00:10, 4120.42it/s] | |
59%|█████▉ | 69609/118287 [00:16<00:12, 3855.45it/s] | |
58%|█████▊ | 68546/118287 [00:16<00:12, 3860.75it/s] | |
50%|█████ | 59706/118287 [00:15<00:15, 3857.69it/s] | |
58%|█████▊ | 68282/118287 [00:16<00:12, 3866.36it/s] | |
63%|██████▎ | 74419/118287 [00:16<00:09, 4612.92it/s] | |
51%|█████▏ | 60764/118287 [00:15<00:15, 3696.60it/s] | |
36%|███▌ | 42324/118287 [00:11<00:19, 3970.08it/s] | |
63%|██████▎ | 74490/118287 [00:16<00:10, 4115.53it/s] | |
59%|█████▉ | 70004/118287 [00:16<00:12, 3880.74it/s] | |
58%|█████▊ | 68945/118287 [00:16<00:12, 3898.45it/s] | |
51%|█████ | 60093/118287 [00:15<00:15, 3810.26it/s] | |
58%|█████▊ | 68670/118287 [00:16<00:12, 3855.98it/s] | |
63%|██████▎ | 74881/118287 [00:16<00:09, 4517.19it/s] | |
52%|█████▏ | 61135/118287 [00:15<00:15, 3667.30it/s] | |
36%|███▌ | 42738/118287 [00:11<00:18, 4014.87it/s] | |
63%|██████▎ | 74903/118287 [00:16<00:10, 4117.24it/s] | |
60%|█████▉ | 70432/118287 [00:16<00:11, 3988.42it/s] | |
59%|█████▊ | 69336/118287 [00:16<00:12, 3888.44it/s] | |
58%|█████▊ | 69071/118287 [00:16<00:12, 3899.39it/s] | |
51%|█████ | 60475/118287 [00:15<00:15, 3723.31it/s] | |
64%|██████▎ | 75335/118287 [00:16<00:09, 4520.47it/s] | |
52%|█████▏ | 61505/118287 [00:15<00:15, 3675.52it/s] | |
36%|███▋ | 43156/118287 [00:11<00:18, 4062.76it/s] | |
64%|██████▎ | 75316/118287 [00:16<00:10, 4080.76it/s] | |
60%|█████▉ | 70838/118287 [00:16<00:11, 4009.02it/s] | |
59%|█████▉ | 69745/118287 [00:16<00:12, 3946.26it/s] | |
51%|█████▏ | 60868/118287 [00:15<00:15, 3782.38it/s] | |
59%|█████▊ | 69462/118287 [00:16<00:12, 3872.11it/s] | |
52%|█████▏ | 61905/118287 [00:15<00:14, 3763.48it/s] | |
64%|██████▍ | 75788/118287 [00:16<00:09, 4459.32it/s] | |
37%|███▋ | 43563/118287 [00:11<00:18, 4046.65it/s] | |
64%|██████▍ | 75725/118287 [00:16<00:10, 4068.29it/s] | |
60%|██████ | 71250/118287 [00:17<00:11, 4040.69it/s] | |
59%|█████▉ | 70170/118287 [00:16<00:11, 4032.13it/s] | |
59%|█████▉ | 69880/118287 [00:16<00:12, 3958.16it/s] | |
52%|█████▏ | 61248/118287 [00:15<00:15, 3715.05it/s] | |
53%|█████▎ | 62283/118287 [00:15<00:14, 3767.30it/s] | |
64%|██████▍ | 76247/118287 [00:16<00:09, 4493.70it/s] | |
37%|███▋ | 43968/118287 [00:11<00:18, 4039.04it/s] | |
64%|██████▍ | 76141/118287 [00:16<00:10, 4093.83it/s] | |
60%|█████▉ | 70615/118287 [00:16<00:11, 4148.33it/s] | |
61%|██████ | 71655/118287 [00:17<00:11, 3900.61it/s] | |
59%|█████▉ | 70331/118287 [00:16<00:11, 4108.11it/s] | |
52%|█████▏ | 61632/118287 [00:16<00:15, 3750.13it/s] | |
53%|█████▎ | 62661/118287 [00:15<00:14, 3759.77it/s] | |
65%|██████▍ | 76697/118287 [00:16<00:09, 4492.40it/s] | |
38%|███▊ | 44373/118287 [00:12<00:18, 4011.68it/s] | |
65%|██████▍ | 76551/118287 [00:16<00:10, 4055.04it/s] | |
60%|██████ | 71037/118287 [00:16<00:11, 4168.33it/s] | |
61%|██████ | 72063/118287 [00:17<00:11, 3952.67it/s] | |
60%|█████▉ | 70765/118287 [00:16<00:11, 4169.39it/s] | |
52%|█████▏ | 62039/118287 [00:16<00:14, 3839.95it/s] | |
53%|█████▎ | 63058/118287 [00:16<00:14, 3813.77it/s] | |
65%|██████▌ | 77152/118287 [00:16<00:09, 4508.76it/s] | |
38%|███▊ | 44802/118287 [00:12<00:17, 4085.95it/s] | |
65%|██████▌ | 76957/118287 [00:16<00:10, 4035.48it/s] | |
61%|██████▏ | 72460/118287 [00:17<00:11, 3933.17it/s] | |
60%|██████ | 71455/118287 [00:16<00:11, 4054.28it/s] | |
60%|██████ | 71192/118287 [00:16<00:11, 4198.17it/s] | |
54%|█████▎ | 63440/118287 [00:16<00:14, 3790.09it/s] | |
66%|██████▌ | 77623/118287 [00:17<00:08, 4561.13it/s] | |
53%|█████▎ | 62425/118287 [00:16<00:15, 3712.04it/s] | |
38%|███▊ | 45212/118287 [00:12<00:17, 4080.60it/s] | |
65%|██████▌ | 77372/118287 [00:16<00:10, 4067.45it/s] | |
62%|██████▏ | 72881/118287 [00:17<00:11, 4010.96it/s] | |
61%|██████ | 71862/118287 [00:16<00:11, 4050.08it/s] | |
54%|█████▍ | 63824/118287 [00:16<00:14, 3804.83it/s] | |
61%|██████ | 71613/118287 [00:17<00:11, 4055.50it/s] | |
66%|██████▌ | 78095/118287 [00:17<00:08, 4604.37it/s] | |
53%|█████▎ | 62834/118287 [00:16<00:14, 3815.92it/s] | |
39%|███▊ | 45621/118287 [00:12<00:17, 4050.56it/s] | |
66%|██████▌ | 77800/118287 [00:17<00:09, 4128.02it/s] | |
62%|██████▏ | 73295/118287 [00:17<00:11, 4041.77it/s] | |
61%|██████ | 72277/118287 [00:17<00:11, 4076.79it/s] | |
61%|██████ | 72024/118287 [00:17<00:11, 4070.63it/s] | |
54%|█████▍ | 64208/118287 [00:16<00:14, 3812.81it/s] | |
66%|██████▋ | 78567/118287 [00:17<00:08, 4638.27it/s] | |
53%|█████▎ | 63237/118287 [00:16<00:14, 3876.05it/s] | |
39%|███▉ | 46054/118287 [00:12<00:17, 4128.68it/s] | |
66%|██████▌ | 78223/118287 [00:17<00:09, 4157.42it/s] | |
62%|██████▏ | 73722/118287 [00:17<00:10, 4106.66it/s] | |
61%|██████▏ | 72686/118287 [00:17<00:11, 4010.76it/s] | |
55%|█████▍ | 64621/118287 [00:16<00:13, 3900.24it/s] | |
61%|██████ | 72433/118287 [00:17<00:11, 4071.42it/s] | |
67%|██████▋ | 79032/118287 [00:17<00:08, 4571.80it/s] | |
39%|███▉ | 46498/118287 [00:12<00:17, 4206.22it/s] | |
54%|█████▍ | 63627/118287 [00:16<00:14, 3769.35it/s] | |
66%|██████▋ | 78640/118287 [00:17<00:09, 4123.41it/s] | |
63%|██████▎ | 74144/118287 [00:17<00:10, 4140.01it/s] | |
62%|██████▏ | 73117/118287 [00:17<00:11, 4095.16it/s] | |
55%|█████▍ | 65018/118287 [00:16<00:13, 3917.65it/s] | |
62%|██████▏ | 72852/118287 [00:17<00:11, 4094.86it/s] | |
67%|██████▋ | 79516/118287 [00:17<00:08, 4648.01it/s] | |
40%|███▉ | 46924/118287 [00:12<00:16, 4219.69it/s] | |
54%|█████▍ | 64013/118287 [00:16<00:14, 3796.03it/s] | |
67%|██████▋ | 79053/118287 [00:17<00:09, 4078.03it/s] | |
63%|██████▎ | 74559/118287 [00:17<00:10, 4089.98it/s] | |
62%|██████▏ | 73528/118287 [00:17<00:11, 4008.38it/s] | |
62%|██████▏ | 73283/118287 [00:17<00:10, 4156.14it/s] | |
68%|██████▊ | 79982/118287 [00:17<00:08, 4642.71it/s] | |
55%|█████▌ | 65411/118287 [00:16<00:13, 3815.46it/s] | |
40%|████ | 47347/118287 [00:12<00:17, 4166.31it/s] | |
54%|█████▍ | 64425/118287 [00:16<00:13, 3882.93it/s] | |
67%|██████▋ | 79480/118287 [00:17<00:09, 4132.31it/s] | |
63%|██████▎ | 74974/118287 [00:17<00:10, 4104.02it/s] | |
63%|██████▎ | 73970/118287 [00:17<00:10, 4121.04it/s] | |
62%|██████▏ | 73700/118287 [00:17<00:10, 4134.34it/s] | |
68%|██████▊ | 80447/118287 [00:17<00:08, 4582.46it/s] | |
56%|█████▌ | 65794/118287 [00:16<00:13, 3775.17it/s] | |
40%|████ | 47771/118287 [00:12<00:16, 4185.11it/s] | |
55%|█████▍ | 64825/118287 [00:16<00:13, 3910.33it/s] | |
68%|██████▊ | 79894/118287 [00:17<00:09, 4120.77it/s] | |
64%|██████▎ | 75385/118287 [00:18<00:10, 4085.90it/s] | |
63%|██████▎ | 74384/118287 [00:17<00:10, 4047.52it/s] | |
63%|██████▎ | 74122/118287 [00:17<00:10, 4150.35it/s] | |
68%|██████▊ | 80906/118287 [00:17<00:08, 4535.30it/s] | |
55%|█████▌ | 65218/118287 [00:16<00:13, 3881.11it/s] | |
41%|████ | 48190/118287 [00:12<00:16, 4133.90it/s] | |
56%|█████▌ | 66173/118287 [00:16<00:14, 3695.03it/s] | |
68%|██████▊ | 80307/118287 [00:17<00:09, 4109.95it/s] | |
64%|██████▍ | 75794/118287 [00:18<00:10, 4037.53it/s] | |
63%|██████▎ | 74807/118287 [00:17<00:10, 4099.29it/s] | |
63%|██████▎ | 74538/118287 [00:17<00:10, 4127.09it/s] | |
69%|██████▉ | 81360/118287 [00:17<00:08, 4497.92it/s] | |
41%|████ | 48631/118287 [00:13<00:16, 4210.94it/s] | |
56%|█████▋ | 66556/118287 [00:16<00:13, 3733.12it/s] | |
55%|█████▌ | 65607/118287 [00:17<00:13, 3791.79it/s] | |
68%|██████▊ | 80719/118287 [00:17<00:09, 4055.95it/s] | |
[tqdm progress bars elided: each Horovod worker scans the 118287 train2017 images for ground-truth boxes at roughly 3,400-4,700 images/s; per-worker completion and summary lines follow] | |
100%|██████████| 118287/118287 [00:26<00:00, 4524.22it/s] [0308 18:58:42 @timer.py:48] Load Groundtruth Boxes for train2017 finished, time:26.2398sec. | |
100%|██████████| 118287/118287 [00:26<00:00, 4404.52it/s] [0308 18:58:43 @timer.py:48] Load Groundtruth Boxes for train2017 finished, time:26.9406sec. | |
Done loading roidbs | |
[0308 18:58:44 @data.py:335] Filtered 1021 images which contain no non-crowd groudtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
100%|██████████| 118287/118287 [00:28<00:00, 4156.66it/s] [0308 18:58:44 @timer.py:48] Load Groundtruth Boxes for train2017 finished, time:28.5410sec. | |
100%|██████████| 118287/118287 [00:28<00:00, 4196.77it/s] [0308 18:58:44 @timer.py:48] Load Groundtruth Boxes for train2017 finished, time:28.2762sec. | |
100%|██████████| 118287/118287 [00:28<00:00, 4202.75it/s] [0308 18:58:44 @timer.py:48] Load Groundtruth Boxes for train2017 finished, time:28.2322sec. | |
Done loading roidbs | |
[0308 18:58:45 @data.py:335] Filtered 1021 images which contain no non-crowd groudtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
Done loading roidbs | |
Done loading roidbs | |
Done loading roidbs | |
[0308 18:58:46 @data.py:335] Filtered 1021 images which contain no non-crowd groudtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
[0308 18:58:46 @data.py:335] Filtered 1021 images which contain no non-crowd groudtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
[0308 18:58:46 @data.py:335] Filtered 1021 images which contain no non-crowd groudtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
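The repeated "Filtered 1021 images ..." lines above come from each worker's dataset-loading step: of the 118287 train2017 images, the 1021 whose ground truth contains only crowd boxes are dropped, leaving 117266. A minimal sketch of that kind of filter, assuming each roidb entry carries a per-box `is_crowd` array (the field name is an assumption for illustration, not taken from this codebase):

import numpy as np

def drop_images_without_noncrowd_boxes(roidbs):
    # Keep only images that have at least one non-crowd ground-truth box.
    # `roidbs` is assumed to be a list of per-image dicts with an
    # `is_crowd` array holding one flag per annotated box (illustrative).
    kept = [r for r in roidbs if np.any(np.asarray(r["is_crowd"]) == 0)]
    print("Filtered {} images which contain no non-crowd groundtruth boxes. "
          "Total #images for training: {}".format(len(roidbs) - len(kept), len(kept)))
    return kept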
Done batching roidbs | |
[0308 18:58:47 @train.py:577] Total passes of the training set is: 24.56 | |
[0308 18:58:47 @trainers.py:391] [HorovodTrainer] local rank=1 | |
[0308 18:58:47 @input_source.py:220] Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
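The "Total passes of the training set is: 24.56" line is just the ratio of images the schedule will consume to the 117266 usable images. A minimal sketch of that arithmetic; the 16 images per step and 180000 total steps below are assumed illustrative values, not read from this log:

images_per_step = 16        # assumed: images consumed per training step across all GPUs
total_steps = 180000        # assumed: total number of training steps in the schedule
num_train_images = 117266   # reported above after filtering
total_passes = images_per_step * total_steps / num_train_images
print("Total passes of the training set is: {:.2f}".format(total_passes))  # -> 24.56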
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:47.295874 140655740700416 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[0308 18:58:47 @registry.py:125] conv0 input: [None, 3, None, None] | |
[0308 18:58:47 @batch_norm.py:164] WRN [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:47.345530 140655740700416 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[0308 18:58:47 @registry.py:133] conv0 output: [None, 64, None, None] | |
[0308 18:58:47 @registry.py:125] pool0 input: [None, 64, None, None] | |
[0308 18:58:47 @registry.py:133] pool0 output: [None, 64, None, None] | |
[0308 18:58:47 @registry.py:125] group0/block0/conv1 input: [None, 64, None, None] | |
[0308 18:58:47 @batch_norm.py:164] WRN [BatchNorm] Using moving_mean/moving_variance in training. | |
[0308 18:58:47 @registry.py:133] group0/block0/conv1 output: [None, 64, None, None] | |
[0308 18:58:47 @registry.py:125] group0/block0/conv2 input: [None, 64, None, None] | |
[0308 18:58:47 @batch_norm.py:164] WRN [BatchNorm] Using moving_mean/moving_variance in training. | |
[0308 18:58:47 @registry.py:133] group0/block0/conv2 output: [None, 64, None, None] | |
[0308 18:58:47 @registry.py:125] group0/block0/conv3 input: [None, 64, None, None] | |
[0308 18:58:47 @batch_norm.py:164] WRN [BatchNorm] Using moving_mean/moving_variance in training. | |
[0308 18:58:47 @registry.py:133] group0/block0/conv3 output: [None, 256, None, None] | |
[0308 18:58:47 @registry.py:125] group0/block0/convshortcut input: [None, 64, None, None] | |
[0308 18:58:47 @batch_norm.py:164] WRN [BatchNorm] Using moving_mean/moving_variance in training. | |
[0308 18:58:47 @registry.py:133] group0/block0/convshortcut output: [None, 256, None, None] | |
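The group0/block0 shapes above trace a standard ResNet bottleneck in NCHW: a 1x1 conv down to 64 channels, a 3x3 at 64, a 1x1 up to 256, plus a 1x1 projection shortcut so the residual add matches. A rough tf.keras sketch of an equivalent block (illustrative only, not the tensorpack code; the frozen BatchNorm flagged by the warnings is omitted for brevity):

import tensorflow as tf

def bottleneck_block(x, mid_ch=64, out_ch=256):
    # convshortcut: 1x1 projection, 64 -> 256, so the add matches conv3's output
    shortcut = tf.keras.layers.Conv2D(out_ch, 1, data_format='channels_first',
                                      use_bias=False)(x)
    y = tf.keras.layers.Conv2D(mid_ch, 1, activation='relu',
                               data_format='channels_first', use_bias=False)(x)   # conv1: 64 -> 64
    y = tf.keras.layers.Conv2D(mid_ch, 3, padding='same', activation='relu',
                               data_format='channels_first', use_bias=False)(y)   # conv2: 64 -> 64
    y = tf.keras.layers.Conv2D(out_ch, 1,
                               data_format='channels_first', use_bias=False)(y)   # conv3: 64 -> 256
    return tf.keras.layers.ReLU()(tf.keras.layers.Add()([y, shortcut]))

inp = tf.keras.Input(shape=(64, None, None))   # NCHW feature map, i.e. [None, 64, None, None]
out = bottleneck_block(inp)                    # -> [None, 256, None, None]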
[0308 18:58:47 @registry.py:125] group0/block1/conv1 input: [None, 256, None, None] | |
82%|████████▏ | 97552/118287 [00:26<00:06, 3394.00it/s][32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
99%|█████████▉| 117496/118287 [00:30<00:00, 3683.71it/s][32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
100%|█████████▉| 118126/118287 [00:30<00:00, 3616.72it/s][32m[0308 18:58:47 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
83%|████████▎ | 97892/118287 [00:26<00:06, 3345.53it/s][32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
100%|██████████| 118287/118287 [00:30<00:00, 3888.47it/s][32m[0308 18:58:47 @timer.py:48][0m Load Groundtruth Boxes for train2017 finished, time:30.5044sec. | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
100%|█████████▉| 117865/118287 [00:30<00:00, 3601.93it/s] | |
83%|████████▎ | 98258/118287 [00:26<00:05, 3427.91it/s][32m[0308 18:58:47 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
100%|█████████▉| 118226/118287 [00:30<00:00, 3500.43it/s] | |
83%|████████▎ | 98606/118287 [00:26<00:05, 3441.03it/s] | |
100%|██████████| 118287/118287 [00:30<00:00, 3854.01it/s][32m[0308 18:58:47 @timer.py:48][0m Load Groundtruth Boxes for train2017 finished, time:30.7709sec. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
84%|████████▎ | 98951/118287 [00:26<00:05, 3384.77it/s][32m[0308 18:58:47 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:47 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:47 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:47 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
84%|████████▍ | 99291/118287 [00:26<00:05, 3333.45it/s] | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
84%|████████▍ | 99642/118287 [00:26<00:05, 3384.09it/s] | |
Done batching roidbs | |
[32m[0308 18:58:48 @train.py:577][0m Total passes of the training set is: 24.56 | |
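"Total passes of the training set is: 24.56" is consistent with simple arithmetic, assuming a global batch of 16 images per training step and 180000 total steps, over the 117266 usable images reported later in this log (both the step count and the batch size are assumptions here, not read back from the log):

    # Assumed: 16 images per global step, 180000 total training steps.
    # 117266 is the post-filtering image count printed further down in this log.
    images_per_step = 16
    total_steps = 180000
    train_images = 117266
    print(round(total_steps * images_per_step / train_images, 2))  # 24.56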
[32m[0308 18:58:48 @trainers.py:391][0m [HorovodTrainer] local rank=5 | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:48.169555 139695895922432 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:48.219148 139695895922432 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
85%|████████▍ | 99992/118287 [00:27<00:05, 3413.38it/s] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
85%|████████▍ | 100334/118287 [00:27<00:05, 3399.25it/s] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
85%|████████▌ | 100696/118287 [00:27<00:05, 3461.76it/s] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
85%|████████▌ | 101051/118287 [00:27<00:04, 3485.35it/s] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
86%|████████▌ | 101400/118287 [00:27<00:04, 3442.33it/s] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
86%|████████▌ | 101764/118287 [00:27<00:04, 3493.25it/s] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
86%|████████▋ | 102114/118287 [00:27<00:04, 3436.53it/s] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
87%|████████▋ | 102460/118287 [00:27<00:04, 3443.51it/s] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:48 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:48 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:48 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
Done loading roidbs | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
87%|████████▋ | 102815/118287 [00:27<00:04, 3474.08it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
Done loading roidbs | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
87%|████████▋ | 103202/118287 [00:27<00:04, 3581.31it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
88%|████████▊ | 103562/118287 [00:28<00:04, 3530.61it/s] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
88%|████████▊ | 103916/118287 [00:28<00:04, 3421.75it/s] | |
Done batching roidbs | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
Done batching roidbs | |
[32m[0308 18:58:49 @train.py:577][0m Total passes of the training set is: 24.56 | |
[32m[0308 18:58:49 @trainers.py:391][0m [HorovodTrainer] local rank=4 | |
[32m[0308 18:58:49 @train.py:577][0m Total passes of the training set is: 24.56 | |
[32m[0308 18:58:49 @trainers.py:391][0m [HorovodTrainer] local rank=6 | |
Done batching roidbs | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @train.py:577][0m Total passes of the training set is: 24.56 | |
[32m[0308 18:58:49 @trainers.py:391][0m [HorovodTrainer] local rank=2 | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
[32m[0308 18:58:49 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:49.395869 140106948318976 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:49.396498 140085798336256 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:49 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:49.407411 140134360606464 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
W0308 18:58:49.444191 140106948318976 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:49.446536 140085798336256 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:49 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:49.459798 140134360606464 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
88%|████████▊ | 104262/118287 [00:28<00:04, 3433.09it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
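The fpn/* lines trace a standard FPN top-down pathway: 1x1 lateral convs map C2..C5 (256/512/1024/2048 channels) to a common 256, each coarser level is upsampled 2x and added to the next finer lateral, 3x3 "posthoc" convs produce P2..P5, and a stride-2 max-pool of P5 gives P6, so all five outputs have 256 channels. A rough sketch assuming NCHW tensors (shape-alignment details omitted; not the project's exact code):

    import tensorflow as tf

    def fpn_sketch(c2345, dim=256):
        # c2345: backbone features C2..C5 with 256/512/1024/2048 channels, NCHW.
        lat = [tf.layers.conv2d(c, dim, 1, data_format="channels_first",
                                name="lateral_1x1_c%d" % (i + 2))
               for i, c in enumerate(c2345)]
        for i in range(len(lat) - 1, 0, -1):
            # Nearest-neighbor 2x upsample of the coarser map, added to the finer lateral.
            up = tf.keras.backend.resize_images(lat[i], 2, 2, "channels_first")
            lat[i - 1] = lat[i - 1] + up
        p2345 = [tf.layers.conv2d(l, dim, 3, padding="same", data_format="channels_first",
                                  name="posthoc_3x3_p%d" % (i + 2))
                 for i, l in enumerate(lat)]
        p6 = tf.layers.max_pooling2d(p2345[-1], 1, 2, data_format="channels_first",
                                     name="maxpool_p6")
        return p2345 + [p6]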
[32m[0308 18:58:49 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
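The rpn/* shapes follow from 3 anchors per spatial location on each FPN level: a shared 3x3 conv keeps 256 channels, a 1x1 conv emits 3 objectness channels, and a 1x1 conv emits 3*4=12 box-delta channels; the class map is apparently transposed to channels-last, giving the logged [None, None, None, 3]. A minimal sketch under those assumptions (TF 1.x layers, NCHW input):

    import tensorflow as tf

    def rpn_head_sketch(feat, num_anchors=3, dim=256):
        # feat: one FPN level, NCHW with 256 channels (see the fpn output above).
        hidden = tf.layers.conv2d(feat, dim, 3, padding="same", activation=tf.nn.relu,
                                  data_format="channels_first", name="conv0")
        label_logits = tf.layers.conv2d(hidden, num_anchors, 1,
                                        data_format="channels_first", name="class")
        box_logits = tf.layers.conv2d(hidden, 4 * num_anchors, 1,
                                      data_format="channels_first", name="box")
        # Match the logged output shapes: class as NHWC, box deltas kept NCHW.
        return tf.transpose(label_logits, [0, 2, 3, 1]), box_logits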
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
88%|████████▊ | 104611/118287 [00:28<00:03, 3447.38it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:49 @data.py:335][0m Filtered 1021 images which contain no non-crowd groundtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
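The filtering step above drops images whose annotations contain no non-crowd groundtruth box, since they contribute nothing to box/mask training; the survivors are then batched into roidbs. A hypothetical sketch of such a filter; the field names 'boxes' and 'is_crowd' are illustrative, not necessarily the project's schema:

    import numpy as np

    def filter_roidbs(roidbs):
        # Keep only images with at least one non-crowd groundtruth box.
        kept = [r for r in roidbs
                if len(r["boxes"]) > 0 and np.any(np.asarray(r["is_crowd"]) == 0)]
        print("Filtered %d images which contain no non-crowd groundtruth boxes. "
              "Total #images for training: %d" % (len(roidbs) - len(kept), len(kept)))
        return kept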
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
89%|████████▊ | 104957/118287 [00:28<00:03, 3392.89it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @data.py:335][0m Filtered 1021 images which contain no non-crowd groundtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
89%|████████▉ | 105315/118287 [00:28<00:03, 3444.88it/s] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
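The [tshape] lines mark where RPN targets are encoded against their anchors. The usual Faster R-CNN parameterization expresses each target box as (dx, dy, dw, dh) relative to its anchor; a sketch of that standard encoding (the exact variant used here may additionally scale or clip the deltas):

    import numpy as np

    def encode_bbox_target_sketch(boxes, anchors):
        # boxes, anchors: [..., 4] as (x1, y1, x2, y2).
        bw, bh = boxes[..., 2] - boxes[..., 0], boxes[..., 3] - boxes[..., 1]
        bx, by = boxes[..., 0] + 0.5 * bw, boxes[..., 1] + 0.5 * bh
        aw, ah = anchors[..., 2] - anchors[..., 0], anchors[..., 3] - anchors[..., 1]
        ax, ay = anchors[..., 0] + 0.5 * aw, anchors[..., 1] + 0.5 * ah
        dx, dy = (bx - ax) / aw, (by - ay) / ah
        dw, dh = np.log(bw / aw), np.log(bh / ah)
        return np.stack([dx, dy, dw, dh], axis=-1)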
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
89%|████████▉ | 105661/118287 [00:28<00:03, 3319.92it/s] | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:49 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
90%|████████▉ | 106023/118287 [00:28<00:03, 3401.54it/s] | |
[32m[0308 18:58:49 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
90%|████████▉ | 106365/118287 [00:28<00:03, 3386.39it/s] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
90%|█████████ | 106705/118287 [00:29<00:03, 3326.37it/s][32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
91%|█████████ | 107071/118287 [00:29<00:03, 3419.12it/s][32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
91%|█████████ | 107440/118287 [00:29<00:03, 3492.53it/s][32m[0308 18:58:50 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
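
Note: the fpn/* lines above trace a standard FPN top-down pathway: 1x1 lateral convolutions on the four backbone outputs (256, 512, 1024, 2048 channels), 2x upsampling plus addition going down from C5, 3x3 "post-hoc" convolutions producing P2-P5, and a stride-2 pooling of P5 for P6, all at 256 channels, channels-first. A rough sketch of that wiring, inferred only from the layer names logged here (hypothetical helper, not MaskRCNN/model_fpn.py itself):

    import tensorflow as tf

    def fpn_sketch(c2, c3, c4, c5, dim=256):
        # Assumes spatial sizes line up exactly for the additions (power-of-2 inputs).
        conv1x1 = lambda n: tf.keras.layers.Conv2D(dim, 1, padding='same',
                                                   data_format='channels_first', name=n)
        conv3x3 = lambda n: tf.keras.layers.Conv2D(dim, 3, padding='same',
                                                   data_format='channels_first', name=n)
        up2x = lambda n: tf.keras.layers.UpSampling2D(2, data_format='channels_first', name=n)

        lat2, lat3, lat4, lat5 = [conv1x1('lateral_1x1_c%d' % i)(c)
                                  for i, c in zip((2, 3, 4, 5), (c2, c3, c4, c5))]
        p5 = lat5
        p4 = lat4 + up2x('upsample_lat5')(p5)   # top-down pathway
        p3 = lat3 + up2x('upsample_lat4')(p4)
        p2 = lat2 + up2x('upsample_lat3')(p3)
        p2345 = [conv3x3('posthoc_3x3_p%d' % i)(p)
                 for i, p in zip((2, 3, 4, 5), (p2, p3, p4, p5))]
        p6 = tf.keras.layers.MaxPool2D(1, strides=2, data_format='channels_first',
                                       name='maxpool_p6')(p2345[-1])
        return p2345 + [p6]   # five 256-channel maps, as in the "fpn output" line above
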
[32m[0308 18:58:50 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
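
Note: the rpn/* channel counts follow from using 3 anchors per location on each FPN level: the class branch outputs 3 channels and the box branch 3 x 4 = 12 channels after a shared 3x3 convolution, matching "rpn/class output: [None, 3, ...]" and "rpn/box output: [None, 12, ...]" above. A minimal sketch of such a head (hypothetical helper, names chosen to mirror the log):

    import tensorflow as tf

    def rpn_head_sketch(feat, num_anchors=3, dim=256):
        conv = lambda ch, k, n: tf.keras.layers.Conv2D(ch, k, padding='same',
                                                       data_format='channels_first', name=n)
        hidden = tf.nn.relu(conv(dim, 3, 'conv0')(feat))       # shared 3x3 conv
        label_logits = conv(num_anchors, 1, 'class')(hidden)   # [N, 3, H, W]
        box_logits = conv(4 * num_anchors, 1, 'box')(hidden)   # [N, 12, H, W]
        return label_logits, box_logits
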
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
91%|█████████ | 107800/118287 [00:29<00:02, 3519.96it/s][32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
91%|█████████▏| 108156/118287 [00:29<00:02, 3527.10it/s][32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
92%|█████████▏| 108524/118287 [00:29<00:02, 3563.59it/s][32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
92%|█████████▏| 108881/118287 [00:29<00:02, 3449.40it/s][32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
92%|█████████▏| 109228/118287 [00:29<00:02, 3411.32it/s][32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:50 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:50 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
93%|█████████▎| 109571/118287 [00:29<00:02, 3330.74it/s][32m[0308 18:58:51 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
93%|█████████▎| 109914/118287 [00:29<00:02, 3357.78it/s][32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
93%|█████████▎| 110261/118287 [00:30<00:02, 3389.73it/s][32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
94%|█████████▎| 110601/118287 [00:30<00:02, 3341.70it/s][32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
94%|█████████▍| 110936/118287 [00:30<00:02, 3302.57it/s][32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
94%|█████████▍| 111294/118287 [00:30<00:02, 3380.89it/s][32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
94%|█████████▍| 111639/118287 [00:30<00:01, 3401.16it/s][32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:51 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:51 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
95%|█████████▍| 111980/118287 [00:30<00:01, 3371.31it/s][buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
95%|█████████▍| 112318/118287 [00:30<00:01, 3371.77it/s] | |
95%|█████████▌| 112656/118287 [00:30<00:01, 3343.02it/s][tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
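
Note: the [tshape] lines come from model_box.encode_bbox_target, which turns ground-truth boxes into regression targets relative to anchors; both inputs are fully dynamic at graph-build time, hence the (?, ?, ?, ?, ?) shapes. A NumPy sketch of the standard R-CNN box encoding this corresponds to (the exact layout and broadcasting in model_box.py may differ):

    import numpy as np

    def encode_bbox_target_sketch(boxes, anchors):
        # boxes, anchors: [..., 4] as (x1, y1, x2, y2).
        def to_cxcywh(b):
            w, h = b[..., 2] - b[..., 0], b[..., 3] - b[..., 1]
            return b[..., 0] + 0.5 * w, b[..., 1] + 0.5 * h, w, h
        bx, by, bw, bh = to_cxcywh(boxes)
        ax, ay, aw, ah = to_cxcywh(anchors)
        return np.stack([(bx - ax) / aw, (by - ay) / ah,
                         np.log(bw / aw), np.log(bh / ah)], axis=-1)
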
96%|█████████▌| 112991/118287 [00:30<00:01, 3126.17it/s] | |
96%|█████████▌| 113328/118287 [00:31<00:01, 3194.63it/s] | |
96%|█████████▌| 113650/118287 [00:31<00:01, 3104.24it/s] | |
96%|█████████▋| 113967/118287 [00:31<00:01, 3121.16it/s] | |
97%|█████████▋| 114281/118287 [00:31<00:01, 3098.78it/s]Done batching roidbs | |
[32m[0308 18:58:52 @train.py:577][0m Total passes of the training set is: 24.56 | |
97%|█████████▋| 114613/118287 [00:31<00:01, 3161.18it/s]Done batching roidbs | |
[32m[0308 18:58:52 @train.py:577][0m Total passes of the training set is: 24.56 | |
[32m[0308 18:58:52 @trainers.py:391][0m [HorovodTrainer] local rank=7 | |
[32m[0308 18:58:52 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:52.630372 139814253213440 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
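This warning (repeated further down by the other Horovod ranks) comes from TF 1.13's own moving-average code, so no change is needed in this run; for user code, the message's suggestion is to pass dtype when the initializer instance is called rather than to its constructor, roughly:

# ---- illustrative sketch of the deprecation fix suggested above (TF 1.x) ----
import tensorflow as tf

# deprecated: dtype passed to the constructor
init_old = tf.zeros_initializer(dtype=tf.float16)
v_old = init_old(shape=[4])

# recommended: dtype passed when the initializer instance is called
init_new = tf.zeros_initializer()
v_new = init_new(shape=[4], dtype=tf.float16)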
[32m[0308 18:58:52 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
97%|█████████▋| 114931/118287 [00:31<00:01, 3156.51it/s][32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:52.687193 139814253213440 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:52 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:52 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
97%|█████████▋| 115248/118287 [00:31<00:00, 3142.99it/s][32m[0308 18:58:52 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:52 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:52 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
98%|█████████▊| 115572/118287 [00:31<00:00, 3171.39it/s][32m[0308 18:58:52 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:52 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:52 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:52 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
98%|█████████▊| 115931/118287 [00:31<00:00, 3286.26it/s][32m[0308 18:58:52 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
98%|█████████▊| 116269/118287 [00:31<00:00, 3313.50it/s][32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
99%|█████████▊| 116610/118287 [00:32<00:00, 3340.74it/s][32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
99%|█████████▉| 116949/118287 [00:32<00:00, 3353.79it/s][32m[0308 18:58:53 @trainers.py:391][0m [HorovodTrainer] local rank=0 | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:58:53.322254 139697895950080 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:58:53.377158 139697895950080 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
99%|█████████▉| 117285/118287 [00:32<00:00, 3306.30it/s][32m[0308 18:58:53 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
99%|█████████▉| 117617/118287 [00:32<00:00, 3302.53it/s][32m[0308 18:58:53 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
100%|█████████▉| 117948/118287 [00:32<00:00, 3191.07it/s][32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
100%|█████████▉| 118269/118287 [00:32<00:00, 3157.02it/s][32m[0308 18:58:53 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
100%|██████████| 118287/118287 [00:32<00:00, 3634.09it/s][32m[0308 18:58:53 @timer.py:48][0m Load Groundtruth Boxes for train2017 finished, time:32.6350sec. | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:53 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:53 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:53 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
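The registry lines from "fpn input" through "fpn output" trace the standard FPN construction: 1x1 lateral convolutions bring C2-C5 to 256 channels, each coarser level is upsampled and added into the next finer lateral, 3x3 "posthoc" convolutions produce P2-P5, and a stride-2 max-pool on P5 yields P6. A compact sketch of that top-down pathway (hedged: channels_last and tf.layers are used here for brevity, whereas the log shows NCHW tensors built with tensorpack layers):

# ---- illustrative FPN top-down sketch (channels_last for brevity) ----
import tensorflow as tf

def fpn(c2, c3, c4, c5, dim=256):
    lat = [tf.layers.conv2d(c, dim, 1, padding='same') for c in (c2, c3, c4, c5)]
    # top-down: upsample the coarser map and add it to the finer lateral
    for i in (2, 1, 0):
        up = tf.image.resize_nearest_neighbor(lat[i + 1], tf.shape(lat[i])[1:3])
        lat[i] = lat[i] + up
    p2, p3, p4, p5 = [tf.layers.conv2d(l, dim, 3, padding='same') for l in lat]
    p6 = tf.layers.max_pooling2d(p5, pool_size=1, strides=2)
    return p2, p3, p4, p5, p6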
[32m[0308 18:58:54 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:54 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
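The rpn/* lines show the shared RPN head applied to every FPN level: a 3x3 convolution keeping 256 channels, then two 1x1 heads producing 3 objectness channels and 12 box-delta channels per location (3 anchors x 4 coordinates). Roughly (hedged sketch, channels_last):

# ---- illustrative RPN head sketch ----
import tensorflow as tf

def rpn_head(feature, num_anchors=3, dim=256):
    hidden = tf.layers.conv2d(feature, dim, 3, padding='same', activation=tf.nn.relu)
    logits = tf.layers.conv2d(hidden, num_anchors, 1)       # objectness per anchor
    deltas = tf.layers.conv2d(hidden, num_anchors * 4, 1)   # 4 deltas per anchor
    return logits, deltas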
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[32m[0308 18:58:54 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:58:54 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[32m[0308 18:58:54 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:55 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:58:55 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:58:55 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:58:55 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
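The box head logged here is the usual 2-FC design: 7x7 ROI features with 256 channels feed fc6/fc7 (1024 units each), then a classification output with 81 logits (80 COCO classes plus background) and a box output with 324 = 81 x 4 class-specific regression values, reshaped to [None, 81, 4]. A compact sketch (hedged, not the repo's code):

# ---- illustrative Fast R-CNN box head sketch ----
import tensorflow as tf

def fastrcnn_head(roi_feature, num_classes=81):            # 80 classes + background
    x = tf.layers.flatten(roi_feature)                      # [N, 256*7*7]
    x = tf.layers.dense(x, 1024, activation=tf.nn.relu)     # fc6
    x = tf.layers.dense(x, 1024, activation=tf.nn.relu)     # fc7
    class_logits = tf.layers.dense(x, num_classes)
    box_logits = tf.layers.dense(x, num_classes * 4)
    return class_logits, tf.reshape(box_logits, [-1, num_classes, 4])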
self.training == True | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m rpn input: [None, 256, None, None] | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[32m[0308 18:58:55 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
Done loading roidbs | |
[32m[0308 18:58:55 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:58:55 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:58:55 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:55 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:58:55 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:58:55 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:58:55 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:58:55 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:58:55 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:58:56 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:58:56 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:56 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:58:56 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[32m[0308 18:58:56 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:58:56 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[32m[0308 18:58:56 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:58:56 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
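The mask branch traced above is the standard Mask R-CNN head: four 3x3 convolutions at 256 channels on 14x14 ROI features, a stride-2 deconvolution up to 28x28, and a final 1x1 convolution emitting one 28x28 mask per class (80 channels). Sketch (hedged, channels_last; the log shows NCHW):

# ---- illustrative mask head sketch ----
import tensorflow as tf

def maskrcnn_head(roi_feature, num_classes=80, dim=256):
    x = roi_feature                                          # [N, 14, 14, 256]
    for _ in range(4):                                       # fcn0 .. fcn3
        x = tf.layers.conv2d(x, dim, 3, padding='same', activation=tf.nn.relu)
    x = tf.layers.conv2d_transpose(x, dim, 2, strides=2,     # deconv -> 28x28
                                   activation=tf.nn.relu)
    return tf.layers.conv2d(x, num_classes, 1)               # per-class mask logits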
W0308 18:58:56.100846 140655740700416 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
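This warning points at MaskRCNN/model_box.py's use of tf.image.crop_and_resize; box_ind was renamed to box_indices in newer TF releases, so the fix is a one-argument rename. Sketch with hypothetical tensors (the third positional argument is the per-box image index either way):

# ---- illustrative sketch of the crop_and_resize rename ----
import tensorflow as tf

image = tf.zeros([2, 64, 64, 256])            # NHWC feature map (hypothetical)
boxes = tf.constant([[0.1, 0.1, 0.9, 0.9]])   # normalized y1, x1, y2, x2
box_indices = tf.constant([0])                # which image each box comes from

# old keyword was box_ind=...; newer TF prefers box_indices (passed
# positionally here so the snippet works on both):
crops = tf.image.crop_and_resize(image, boxes, box_indices, crop_size=[14, 14])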
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:58:56 @data.py:335][0m Filtered 1021 images which contain no non-crowd groundtruth boxes. Total #images for training: 117266 | |
Batching roidbs | |
[32m[0308 18:58:56 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:58:56 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
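regularize_cost() selects the 63 convolution/FC kernels listed above (the .../W:0 tensors) for weight decay; conv0 and group0 kernels are absent, presumably because the first backbone stages are frozen in this config and therefore not trainable. A generic TF equivalent of building an L2 cost over regex-selected variables (hedged: this is the idea only, not the tensorpack API call):

# ---- generic sketch of regex-based L2 weight decay (not the tensorpack API) ----
import re
import tensorflow as tf

def l2_regularization(pattern='.*/W:0$', weight=1e-4):
    # pick trainable kernels whose names match the pattern, sum their L2 norms
    selected = [v for v in tf.trainable_variables() if re.match(pattern, v.name)]
    return weight * tf.add_n([tf.nn.l2_loss(v) for v in selected], name='wd_cost')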
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
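
The 63 tensors listed by the regularize_cost() call just above are exactly the trainable variables whose names end in /W, i.e. the conv and FC kernels; BatchNorm gamma/beta and biases get no weight decay. A rough plain-TF equivalent (the weight-decay value here is a placeholder, not the one used by this run):

    import re
    import tensorflow as tf

    def l2_weight_decay_sketch(weight_decay=1e-4):
        # Select only kernel weights ('.../W:0'), mirroring the logged list above.
        kernels = [v for v in tf.trainable_variables()
                   if re.search(r'/W:0$', v.name)]
        return weight_decay * tf.add_n([tf.nn.l2_loss(v) for v in kernels])
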
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
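
The (?, 5) -> (?, 4) shapes above suggest the batched boxes carry a leading per-image index column that tf_area drops before computing areas. A sketch under that assumed (image_idx, x1, y1, x2, y2) layout:

    import tensorflow as tf

    def tf_area_batch_sketch(boxes5):
        # boxes5: (N, 5) = (image_idx, x1, y1, x2, y2)  -- assumed layout
        coords = boxes5[:, 1:]                           # (N, 4)
        x1, y1, x2, y2 = tf.unstack(coords, axis=1)
        return (x2 - x1) * (y2 - y1)                     # (N,)
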
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:58:56 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:58:56 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
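
The fastrcnn trace above is the usual two-FC box head: pooled (N, 256, 7, 7) ROI features are flattened to 12544, passed through fc6/fc7 (1024 units each), and split into class logits (81 = 80 COCO classes + background) and box logits (324 = 81 * 4, reshaped to (N, 81, 4)). A minimal sketch using tf.layers (the repository's own layer wrappers differ):

    import tensorflow as tf

    def fastrcnn_2fc_head_sketch(roi_feats, num_classes=81):
        # roi_feats: (N, 256, 7, 7) pooled ROI features, as traced above.
        x = tf.layers.flatten(roi_feats)                              # (N, 12544)
        x = tf.layers.dense(x, 1024, activation=tf.nn.relu, name='fc6')
        x = tf.layers.dense(x, 1024, activation=tf.nn.relu, name='fc7')
        class_logits = tf.layers.dense(x, num_classes, name='class')  # (N, 81)
        box_logits = tf.layers.dense(x, num_classes * 4, name='box')  # (N, 324)
        return class_logits, tf.reshape(box_logits, [-1, num_classes, 4])
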
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
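
encode_bbox_target above maps matched (?, 4) boxes onto their (?, 4) anchors as the standard (dx, dy, dw, dh) regression targets. A sketch of that textbook encoding (any extra scaling constants the repository applies are not visible in this log):

    import tensorflow as tf

    def encode_bbox_target_sketch(boxes, anchors):
        # boxes, anchors: (N, 4) as (x1, y1, x2, y2)
        def to_cxcywh(b):
            x1, y1, x2, y2 = tf.unstack(b, axis=1)
            return (x1 + x2) * 0.5, (y1 + y2) * 0.5, x2 - x1, y2 - y1
        cx, cy, w, h = to_cxcywh(boxes)
        acx, acy, aw, ah = to_cxcywh(anchors)
        dx = (cx - acx) / aw
        dy = (cy - acy) / ah
        dw = tf.log(w / aw)
        dh = tf.log(h / ah)
        return tf.stack([dx, dy, dw, dh], axis=1)        # (N, 4)
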
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
W0308 18:58:56.706800 140655740700416 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
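
The four tensors printed above feed the usual Fast R-CNN losses: softmax cross-entropy over all sampled ROIs, plus a box-regression loss on the ground-truth-class row of fg_box_logits against the encoded foreground targets. A sketch of that textbook formulation (Huber loss stands in for smooth-L1; the row ordering of fg_box_logits and the normalization used by the repository are assumptions):

    import tensorflow as tf

    def fastrcnn_losses_sketch(labels, label_logits, fg_boxes, fg_box_logits):
        # labels: (N,) int64, 0 = background; label_logits: (N, 81)
        # fg_boxes: (N_fg, 4) encoded targets; fg_box_logits: (N_fg, 81, 4)
        label_loss = tf.reduce_mean(
            tf.nn.sparse_softmax_cross_entropy_with_logits(
                labels=labels, logits=label_logits))

        # assumes fg_box_logits rows follow the order of foreground entries in labels
        fg_inds = tf.reshape(tf.where(labels > 0), [-1])
        fg_labels = tf.cast(tf.gather(labels, fg_inds), tf.int32)   # (N_fg,)
        indices = tf.stack([tf.range(tf.size(fg_labels)), fg_labels], axis=1)
        gt_class_deltas = tf.gather_nd(fg_box_logits, indices)      # (N_fg, 4)
        box_loss = tf.losses.huber_loss(fg_boxes, gt_class_deltas)
        return label_loss, box_loss
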
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:58:57 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:58:57 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
W0308 18:58:57.173261 139695895922432 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:58:57 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:58:57 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
W0308 18:58:57.778473 139695895922432 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:58:57 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:58:57 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:58:58 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:58:58 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
W0308 18:58:58.197804 140085798336256 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:58:58 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:58:58 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
W0308 18:58:58.598326 140106948318976 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:58:58 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
W0308 18:58:58.651162 140134360606464 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
W0308 18:58:58.803745 140085798336256 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
[32m[0308 18:58:58 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:58:58 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
[32m[0308 18:58:58 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:58:58 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
W0308 18:58:59.263675 140106948318976 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
W0308 18:58:59.287084 140134360606464 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
Done batching roidbs | |
[32m[0308 18:59:00 @train.py:577][0m Total passes of the training set is: 24.56 | |
[32m[0308 18:59:00 @trainers.py:391][0m [HorovodTrainer] local rank=3 | |
[32m[0308 18:59:00 @input_source.py:220][0m Setting up the queue 'QueueInput/input_queue' for CPU prefetching ... | |
WARNING: Logging before flag parsing goes to stderr. | |
W0308 18:59:00.313378 140223030650624 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/moving_averages.py:210: calling Zeros.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:59:00 @registry.py:125][0m conv0 input: [None, 3, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:59:00.370148 140223030650624 deprecation.py:506] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/init_ops.py:1253: calling VarianceScaling.__init__ (from tensorflow.python.ops.init_ops) with dtype is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Call initializer instance with the dtype argument instead of passing it to the constructor | |
[32m[0308 18:59:00 @registry.py:133][0m conv0 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m pool0 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:133][0m pool0 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block0/conv1 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block0/conv1 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block0/conv2 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block0/conv2 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block0/conv3 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block0/conv3 output: [None, 256, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block0/convshortcut input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block0/convshortcut output: [None, 256, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block1/conv1 input: [None, 256, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block1/conv1 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block1/conv2 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block1/conv2 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block1/conv3 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block1/conv3 output: [None, 256, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block2/conv1 input: [None, 256, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block2/conv1 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block2/conv2 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block2/conv2 output: [None, 64, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group0/block2/conv3 input: [None, 64, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group0/block2/conv3 output: [None, 256, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group1/block0/conv1 input: [None, 256, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group1/block0/conv1 output: [None, 128, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group1/block0/conv2 input: [None, 128, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group1/block0/conv2 output: [None, 128, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group1/block0/conv3 input: [None, 128, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group1/block0/conv3 output: [None, 512, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group1/block0/convshortcut input: [None, 256, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:00 @registry.py:133][0m group1/block0/convshortcut output: [None, 512, None, None] | |
[32m[0308 18:59:00 @registry.py:125][0m group1/block1/conv1 input: [None, 512, None, None] | |
[32m[0308 18:59:00 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block1/conv1 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block1/conv2 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block1/conv2 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block1/conv3 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block1/conv3 output: [None, 512, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block2/conv1 input: [None, 512, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block2/conv1 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block2/conv2 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block2/conv2 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block2/conv3 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block2/conv3 output: [None, 512, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block3/conv1 input: [None, 512, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block3/conv1 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block3/conv2 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block3/conv2 output: [None, 128, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group1/block3/conv3 input: [None, 128, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[32m[0308 18:59:01 @registry.py:133][0m group1/block3/conv3 output: [None, 512, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block0/conv1 input: [None, 512, None, None] | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:59:01 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block0/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block0/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:59:01 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block0/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block0/conv3 input: [None, 256, None, None] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block0/conv3 output: [None, 1024, None, None] | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block0/convshortcut input: [None, 512, None, None] | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block0/convshortcut output: [None, 1024, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block1/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block1/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block1/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:01 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block1/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block1/conv3 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @model_utils.py:64][0m [36mTrainable Variables: | |
[0mname shape dim | |
------------------------------------- ------------------ -------- | |
group1/block0/conv1/W:0 [1, 1, 256, 128] 32768 | |
group1/block0/conv1/bn/gamma:0 [128] 128 | |
group1/block0/conv1/bn/beta:0 [128] 128 | |
group1/block0/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block0/conv2/bn/gamma:0 [128] 128 | |
group1/block0/conv2/bn/beta:0 [128] 128 | |
group1/block0/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block0/conv3/bn/gamma:0 [512] 512 | |
group1/block0/conv3/bn/beta:0 [512] 512 | |
group1/block0/convshortcut/W:0 [1, 1, 256, 512] 131072 | |
group1/block0/convshortcut/bn/gamma:0 [512] 512 | |
group1/block0/convshortcut/bn/beta:0 [512] 512 | |
group1/block1/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block1/conv1/bn/gamma:0 [128] 128 | |
group1/block1/conv1/bn/beta:0 [128] 128 | |
group1/block1/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block1/conv2/bn/gamma:0 [128] 128 | |
group1/block1/conv2/bn/beta:0 [128] 128 | |
group1/block1/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block1/conv3/bn/gamma:0 [512] 512 | |
group1/block1/conv3/bn/beta:0 [512] 512 | |
group1/block2/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block2/conv1/bn/gamma:0 [128] 128 | |
group1/block2/conv1/bn/beta:0 [128] 128 | |
group1/block2/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block2/conv2/bn/gamma:0 [128] 128 | |
group1/block2/conv2/bn/beta:0 [128] 128 | |
group1/block2/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block2/conv3/bn/gamma:0 [512] 512 | |
group1/block2/conv3/bn/beta:0 [512] 512 | |
group1/block3/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block3/conv1/bn/gamma:0 [128] 128 | |
group1/block3/conv1/bn/beta:0 [128] 128 | |
group1/block3/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block3/conv2/bn/gamma:0 [128] 128 | |
group1/block3/conv2/bn/beta:0 [128] 128 | |
group1/block3/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block3/conv3/bn/gamma:0 [512] 512 | |
group1/block3/conv3/bn/beta:0 [512] 512 | |
group2/block0/conv1/W:0 [1, 1, 512, 256] 131072 | |
group2/block0/conv1/bn/gamma:0 [256] 256 | |
group2/block0/conv1/bn/beta:0 [256] 256 | |
group2/block0/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block0/conv2/bn/gamma:0 [256] 256 | |
group2/block0/conv2/bn/beta:0 [256] 256 | |
group2/block0/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block0/conv3/bn/gamma:0 [1024] 1024 | |
group2/block0/conv3/bn/beta:0 [1024] 1024 | |
group2/block0/convshortcut/W:0 [1, 1, 512, 1024] 524288 | |
group2/block0/convshortcut/bn/gamma:0 [1024] 1024 | |
group2/block0/convshortcut/bn/beta:0 [1024] 1024 | |
group2/block1/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block1/conv1/bn/gamma:0 [256] 256 | |
group2/block1/conv1/bn/beta:0 [256] 256 | |
group2/block1/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block1/conv2/bn/gamma:0 [256] 256 | |
group2/block1/conv2/bn/beta:0 [256] 256 | |
group2/block1/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block1/conv3/bn/gamma:0 [1024] 1024 | |
group2/block1/conv3/bn/beta:0 [1024] 1024 | |
group2/block2/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block2/conv1/bn/gamma:0 [256] 256 | |
group2/block2/conv1/bn/beta:0 [256] 256 | |
group2/block2/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block2/conv2/bn/gamma:0 [256] 256 | |
group2/block2/conv2/bn/beta:0 [256] 256 | |
group2/block2/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block2/conv3/bn/gamma:0 [1024] 1024 | |
group2/block2/conv3/bn/beta:0 [1024] 1024 | |
group2/block3/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block3/conv1/bn/gamma:0 [256] 256 | |
group2/block3/conv1/bn/beta:0 [256] 256 | |
group2/block3/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block3/conv2/bn/gamma:0 [256] 256 | |
group2/block3/conv2/bn/beta:0 [256] 256 | |
group2/block3/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block3/conv3/bn/gamma:0 [1024] 1024 | |
group2/block3/conv3/bn/beta:0 [1024] 1024 | |
group2/block4/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block4/conv1/bn/gamma:0 [256] 256 | |
group2/block4/conv1/bn/beta:0 [256] 256 | |
group2/block4/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block4/conv2/bn/gamma:0 [256] 256 | |
group2/block4/conv2/bn/beta:0 [256] 256 | |
group2/block4/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block4/conv3/bn/gamma:0 [1024] 1024 | |
group2/block4/conv3/bn/beta:0 [1024] 1024 | |
group2/block5/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block5/conv1/bn/gamma:0 [256] 256 | |
group2/block5/conv1/bn/beta:0 [256] 256 | |
group2/block5/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block5/conv2/bn/gamma:0 [256] 256 | |
group2/block5/conv2/bn/beta:0 [256] 256 | |
group2/block5/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block5/conv3/bn/gamma:0 [1024] 1024 | |
group2/block5/conv3/bn/beta:0 [1024] 1024 | |
group3/block0/conv1/W:0 [1, 1, 1024, 512] 524288 | |
group3/block0/conv1/bn/gamma:0 [512] 512 | |
group3/block0/conv1/bn/beta:0 [512] 512 | |
group3/block0/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block0/conv2/bn/gamma:0 [512] 512 | |
group3/block0/conv2/bn/beta:0 [512] 512 | |
group3/block0/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block0/conv3/bn/gamma:0 [2048] 2048 | |
group3/block0/conv3/bn/beta:0 [2048] 2048 | |
group3/block0/convshortcut/W:0 [1, 1, 1024, 2048] 2097152 | |
group3/block0/convshortcut/bn/gamma:0 [2048] 2048 | |
group3/block0/convshortcut/bn/beta:0 [2048] 2048 | |
group3/block1/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block1/conv1/bn/gamma:0 [512] 512 | |
group3/block1/conv1/bn/beta:0 [512] 512 | |
group3/block1/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block1/conv2/bn/gamma:0 [512] 512 | |
group3/block1/conv2/bn/beta:0 [512] 512 | |
group3/block1/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block1/conv3/bn/gamma:0 [2048] 2048 | |
group3/block1/conv3/bn/beta:0 [2048] 2048 | |
group3/block2/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block2/conv1/bn/gamma:0 [512] 512 | |
group3/block2/conv1/bn/beta:0 [512] 512 | |
group3/block2/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block2/conv2/bn/gamma:0 [512] 512 | |
group3/block2/conv2/bn/beta:0 [512] 512 | |
group3/block2/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block2/conv3/bn/gamma:0 [2048] 2048 | |
group3/block2/conv3/bn/beta:0 [2048] 2048 | |
fpn/lateral_1x1_c2/W:0 [1, 1, 256, 256] 65536 | |
fpn/lateral_1x1_c2/b:0 [256] 256 | |
fpn/lateral_1x1_c3/W:0 [1, 1, 512, 256] 131072 | |
fpn/lateral_1x1_c3/b:0 [256] 256 | |
fpn/lateral_1x1_c4/W:0 [1, 1, 1024, 256] 262144 | |
fpn/lateral_1x1_c4/b:0 [256] 256 | |
fpn/lateral_1x1_c5/W:0 [1, 1, 2048, 256] 524288 | |
fpn/lateral_1x1_c5/b:0 [256] 256 | |
fpn/posthoc_3x3_p2/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p2/b:0 [256] 256 | |
fpn/posthoc_3x3_p3/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p3/b:0 [256] 256 | |
fpn/posthoc_3x3_p4/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p4/b:0 [256] 256 | |
fpn/posthoc_3x3_p5/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p5/b:0 [256] 256 | |
rpn/conv0/W:0 [3, 3, 256, 256] 589824 | |
rpn/conv0/b:0 [256] 256 | |
rpn/class/W:0 [1, 1, 256, 3] 768 | |
rpn/class/b:0 [3] 3 | |
rpn/box/W:0 [1, 1, 256, 12] 3072 | |
rpn/box/b:0 [12] 12 | |
fastrcnn/fc6/W:0 [12544, 1024] 12845056 | |
fastrcnn/fc6/b:0 [1024] 1024 | |
fastrcnn/fc7/W:0 [1024, 1024] 1048576 | |
fastrcnn/fc7/b:0 [1024] 1024 | |
fastrcnn/outputs/class/W:0 [1024, 81] 82944 | |
fastrcnn/outputs/class/b:0 [81] 81 | |
fastrcnn/outputs/box/W:0 [1024, 324] 331776 | |
fastrcnn/outputs/box/b:0 [324] 324 | |
maskrcnn/fcn0/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn0/b:0 [256] 256 | |
maskrcnn/fcn1/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn1/b:0 [256] 256 | |
maskrcnn/fcn2/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn2/b:0 [256] 256 | |
maskrcnn/fcn3/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn3/b:0 [256] 256 | |
maskrcnn/deconv/W:0 [2, 2, 256, 256] 262144 | |
maskrcnn/deconv/b:0 [256] 256 | |
maskrcnn/conv/W:0 [1, 1, 256, 80] 20480 | |
maskrcnn/conv/b:0 [80] 80[36m | |
Total #vars=168, #params=44175092, size=168.51MB[0m | |
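
The dim column in the table above is simply the product of the listed shape, and the 12544-wide input of fastrcnn/fc6 is the flattened 256x7x7 ROI feature; a quick arithmetic spot check:

    assert 256 * 7 * 7 == 12544                         # fc6 input width
    assert 12544 * 1024 == 12845056                     # fastrcnn/fc6/W
    assert 3 * 3 * 512 * 512 == 2359296                 # group3/block0/conv2/W
    assert 1 * 1 * 256 * 80 == 20480                    # maskrcnn/conv/W
    assert 1024 * 324 == 331776                         # fastrcnn/outputs/box/W
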
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback EstimatedTimeLeft is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback SessionRunTimeout is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback ThroughputTracker is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback MovingAverageSummary is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:160][0m [5m[31mWRN[0m Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[32m[0308 18:59:01 @base.py:208][0m Setup callbacks graph ... | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block1/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block2/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block2/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block2/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block2/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block2/conv3 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block2/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block3/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block3/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block3/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block3/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block3/conv3 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block3/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block4/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:59:01 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:59:01 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
[32m[0308 18:59:01 @registry.py:133][0m group2/block4/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block4/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
W0308 18:59:01.962667 139814253213440 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
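The warning above points at the crop_and_resize call in MaskRCNN/model_box.py (used for the ROIAlign-style cropping). A minimal sketch of the rename it asks for, assuming a TF 1.13-era call site roughly like this one (function and argument names other than the TF API are illustrative):

    import tensorflow as tf

    def crop_rois(featuremap, boxes, box_indices, crop_size):
        # Old keyword, which triggers the deprecation warning on TF 1.13:
        #   tf.image.crop_and_resize(featuremap, boxes, box_ind=box_indices, crop_size=crop_size)
        # Renamed keyword suggested by the warning:
        return tf.image.crop_and_resize(featuremap, boxes,
                                        box_indices=box_indices,
                                        crop_size=crop_size)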
[32m[0308 18:59:01 @registry.py:133][0m group2/block4/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:01 @registry.py:125][0m group2/block4/conv3 input: [None, 256, None, None] | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[32m[0308 18:59:01 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group2/block4/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group2/block5/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn input: [None, 256, 7, 7] | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn/fc6 input: [None, 256, 7, 7] | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn/fc6 output: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn/fc7 input: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn/fc7 output: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn/outputs input: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn/outputs/class input: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:133][0m group2/block5/conv1 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group2/block5/conv2 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn/outputs/class output: [None, 81] | |
[32m[0308 18:59:02 @registry.py:125][0m fastrcnn/outputs/box input: [None, 1024] | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn/outputs/box output: [None, 324] | |
[32m[0308 18:59:02 @registry.py:133][0m fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
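The two head output shapes are consistent with 81 classes (80 COCO categories plus background) and class-specific box regression, so the box branch emits 81 * 4 = 324 values per ROI before being reshaped to [None, 81, 4]. A quick check of that arithmetic (constants read off the log lines above, not from the config):

    NUM_CLASSES = 81           # 80 COCO categories + background, per fastrcnn/outputs/class: [None, 81]
    BOX_PARAMS_PER_CLASS = 4   # (dx, dy, dw, dh)
    assert NUM_CLASSES * BOX_PARAMS_PER_CLASS == 324   # matches fastrcnn/outputs/box: [None, 324]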
self.training == True | |
[32m[0308 18:59:02 @registry.py:133][0m group2/block5/conv2 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group2/block5/conv3 input: [None, 256, None, None] | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[32m[0308 18:59:02 @registry.py:133][0m group2/block5/conv3 output: [None, 1024, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block0/conv1 input: [None, 1024, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block0/conv1 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block0/conv2 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block0/conv2 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block0/conv3 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @regularize.py:95][0m regularize_cost() found 63 variables to regularize. | |
[32m[0308 18:59:02 @regularize.py:20][0m The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
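The 63 regularized tensors are exactly the conv/FC kernels (*/W:0); BatchNorm gammas/betas and biases are left out. A rough, self-contained sketch of what a regex-based weight-decay term over that list amounts to (this is not tensorpack's actual regularize_cost implementation; the pattern and decay value are assumptions):

    import re
    import tensorflow as tf

    def l2_weight_decay_cost(pattern=r".*/W$", weight_decay=1e-4):
        # Collect every trainable variable whose name matches the pattern,
        # mirroring the "found 63 variables to regularize" line above.
        kernels = [v for v in tf.trainable_variables()
                   if re.match(pattern, v.name.rsplit(":", 1)[0])]
        # Weighted sum of L2 penalties over the matched kernels.
        return weight_decay * tf.add_n([tf.nn.l2_loss(v) for v in kernels],
                                       name="wd_cost")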
[32m[0308 18:59:02 @registry.py:133][0m group3/block0/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block0/convshortcut input: [None, 1024, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block0/convshortcut output: [None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block1/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block1/conv1 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block1/conv2 input: [None, 512, None, None] | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block1/conv2 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block1/conv3 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block1/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block2/conv1 input: [None, 2048, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block2/conv1 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block2/conv2 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block2/conv2 output: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m group3/block2/conv3 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @batch_norm.py:164][0m [5m[31mWRN[0m [BatchNorm] Using moving_mean/moving_variance in training. | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @base.py:229][0m Creating the session ... | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:133][0m group3/block2/conv3 output: [None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn input: [None, 256, None, None],[None, 512, None, None],[None, 1024, None, None],[None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/lateral_1x1_c2 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/deconv input: [None, 256, 14, 14] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/lateral_1x1_c2 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/lateral_1x1_c3 input: [None, 512, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/deconv output: [None, 256, 28, 28] | |
[32m[0308 18:59:02 @registry.py:125][0m maskrcnn/conv input: [None, 256, 28, 28] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/lateral_1x1_c3 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/lateral_1x1_c4 input: [None, 1024, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn/conv output: [None, 80, 28, 28] | |
[32m[0308 18:59:02 @registry.py:133][0m maskrcnn output: [None, 80, 28, 28] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/lateral_1x1_c4 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/lateral_1x1_c5 input: [None, 2048, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/lateral_1x1_c5 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/upsample_lat5 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:02 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
W0308 18:59:02.619998 139814253213440 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
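This particular warning comes from TensorFlow's own gradient code (array_grad.py), so there is nothing to change in the training repo; for reference, the replacement the warning suggests for user code is simply:

    import tensorflow as tf

    x = tf.constant([1.7, 2.3])
    y_old = tf.to_int32(x)          # deprecated spelling
    y_new = tf.cast(x, tf.int32)    # replacement suggested by the warning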
[32m[0308 18:59:02 @model_utils.py:64][0m [36mTrainable Variables: | |
[0mname shape dim | |
------------------------------------- ------------------ -------- | |
group1/block0/conv1/W:0 [1, 1, 256, 128] 32768 | |
group1/block0/conv1/bn/gamma:0 [128] 128 | |
group1/block0/conv1/bn/beta:0 [128] 128 | |
group1/block0/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block0/conv2/bn/gamma:0 [128] 128 | |
group1/block0/conv2/bn/beta:0 [128] 128 | |
group1/block0/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block0/conv3/bn/gamma:0 [512] 512 | |
group1/block0/conv3/bn/beta:0 [512] 512 | |
group1/block0/convshortcut/W:0 [1, 1, 256, 512] 131072 | |
group1/block0/convshortcut/bn/gamma:0 [512] 512 | |
group1/block0/convshortcut/bn/beta:0 [512] 512 | |
group1/block1/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block1/conv1/bn/gamma:0 [128] 128 | |
group1/block1/conv1/bn/beta:0 [128] 128 | |
group1/block1/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block1/conv2/bn/gamma:0 [128] 128 | |
group1/block1/conv2/bn/beta:0 [128] 128 | |
group1/block1/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block1/conv3/bn/gamma:0 [512] 512 | |
group1/block1/conv3/bn/beta:0 [512] 512 | |
group1/block2/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block2/conv1/bn/gamma:0 [128] 128 | |
group1/block2/conv1/bn/beta:0 [128] 128 | |
group1/block2/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block2/conv2/bn/gamma:0 [128] 128 | |
group1/block2/conv2/bn/beta:0 [128] 128 | |
group1/block2/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block2/conv3/bn/gamma:0 [512] 512 | |
group1/block2/conv3/bn/beta:0 [512] 512 | |
group1/block3/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block3/conv1/bn/gamma:0 [128] 128 | |
group1/block3/conv1/bn/beta:0 [128] 128 | |
group1/block3/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block3/conv2/bn/gamma:0 [128] 128 | |
group1/block3/conv2/bn/beta:0 [128] 128 | |
group1/block3/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block3/conv3/bn/gamma:0 [512] 512 | |
group1/block3/conv3/bn/beta:0 [512] 512 | |
group2/block0/conv1/W:0 [1, 1, 512, 256] 131072 | |
group2/block0/conv1/bn/gamma:0 [256] 256 | |
group2/block0/conv1/bn/beta:0 [256] 256 | |
group2/block0/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block0/conv2/bn/gamma:0 [256] 256 | |
group2/block0/conv2/bn/beta:0 [256] 256 | |
group2/block0/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block0/conv3/bn/gamma:0 [1024] 1024 | |
group2/block0/conv3/bn/beta:0 [1024] 1024 | |
group2/block0/convshortcut/W:0 [1, 1, 512, 1024] 524288 | |
group2/block0/convshortcut/bn/gamma:0 [1024] 1024 | |
group2/block0/convshortcut/bn/beta:0 [1024] 1024 | |
group2/block1/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block1/conv1/bn/gamma:0 [256] 256 | |
group2/block1/conv1/bn/beta:0 [256] 256 | |
group2/block1/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block1/conv2/bn/gamma:0 [256] 256 | |
group2/block1/conv2/bn/beta:0 [256] 256 | |
group2/block1/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block1/conv3/bn/gamma:0 [1024] 1024 | |
group2/block1/conv3/bn/beta:0 [1024] 1024 | |
group2/block2/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block2/conv1/bn/gamma:0 [256] 256 | |
group2/block2/conv1/bn/beta:0 [256] 256 | |
group2/block2/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block2/conv2/bn/gamma:0 [256] 256 | |
group2/block2/conv2/bn/beta:0 [256] 256 | |
group2/block2/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block2/conv3/bn/gamma:0 [1024] 1024 | |
group2/block2/conv3/bn/beta:0 [1024] 1024 | |
group2/block3/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block3/conv1/bn/gamma:0 [256] 256 | |
group2/block3/conv1/bn/beta:0 [256] 256 | |
group2/block3/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block3/conv2/bn/gamma:0 [256] 256 | |
group2/block3/conv2/bn/beta:0 [256] 256 | |
group2/block3/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block3/conv3/bn/gamma:0 [1024] 1024 | |
group2/block3/conv3/bn/beta:0 [1024] 1024 | |
group2/block4/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block4/conv1/bn/gamma:0 [256] 256 | |
group2/block4/conv1/bn/beta:0 [256] 256 | |
group2/block4/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block4/conv2/bn/gamma:0 [256] 256 | |
group2/block4/conv2/bn/beta:0 [256] 256 | |
group2/block4/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block4/conv3/bn/gamma:0 [1024] 1024 | |
group2/block4/conv3/bn/beta:0 [1024] 1024 | |
group2/block5/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block5/conv1/bn/gamma:0 [256] 256 | |
group2/block5/conv1/bn/beta:0 [256] 256 | |
group2/block5/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block5/conv2/bn/gamma:0 [256] 256 | |
group2/block5/conv2/bn/beta:0 [256] 256 | |
group2/block5/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block5/conv3/bn/gamma:0 [1024] 1024 | |
group2/block5/conv3/bn/beta:0 [1024] 1024 | |
group3/block0/conv1/W:0 [1, 1, 1024, 512] 524288 | |
group3/block0/conv1/bn/gamma:0 [512] 512 | |
group3/block0/conv1/bn/beta:0 [512] 512 | |
group3/block0/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block0/conv2/bn/gamma:0 [512] 512 | |
group3/block0/conv2/bn/beta:0 [512] 512 | |
group3/block0/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block0/conv3/bn/gamma:0 [2048] 2048 | |
group3/block0/conv3/bn/beta:0 [2048] 2048 | |
group3/block0/convshortcut/W:0 [1, 1, 1024, 2048] 2097152 | |
group3/block0/convshortcut/bn/gamma:0 [2048] 2048 | |
group3/block0/convshortcut/bn/beta:0 [2048] 2048 | |
group3/block1/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block1/conv1/bn/gamma:0 [512] 512 | |
group3/block1/conv1/bn/beta:0 [512] 512 | |
group3/block1/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block1/conv2/bn/gamma:0 [512] 512 | |
group3/block1/conv2/bn/beta:0 [512] 512 | |
group3/block1/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block1/conv3/bn/gamma:0 [2048] 2048 | |
group3/block1/conv3/bn/beta:0 [2048] 2048 | |
group3/block2/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block2/conv1/bn/gamma:0 [512] 512 | |
group3/block2/conv1/bn/beta:0 [512] 512 | |
group3/block2/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block2/conv2/bn/gamma:0 [512] 512 | |
group3/block2/conv2/bn/beta:0 [512] 512 | |
group3/block2/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block2/conv3/bn/gamma:0 [2048] 2048 | |
group3/block2/conv3/bn/beta:0 [2048] 2048 | |
fpn/lateral_1x1_c2/W:0 [1, 1, 256, 256] 65536 | |
fpn/lateral_1x1_c2/b:0 [256] 256 | |
fpn/lateral_1x1_c3/W:0 [1, 1, 512, 256] 131072 | |
fpn/lateral_1x1_c3/b:0 [256] 256 | |
fpn/lateral_1x1_c4/W:0 [1, 1, 1024, 256] 262144 | |
fpn/lateral_1x1_c4/b:0 [256] 256 | |
fpn/lateral_1x1_c5/W:0 [1, 1, 2048, 256] 524288 | |
fpn/lateral_1x1_c5/b:0 [256] 256 | |
fpn/posthoc_3x3_p2/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p2/b:0 [256] 256 | |
fpn/posthoc_3x3_p3/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p3/b:0 [256] 256 | |
fpn/posthoc_3x3_p4/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p4/b:0 [256] 256 | |
fpn/posthoc_3x3_p5/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p5/b:0 [256] 256 | |
rpn/conv0/W:0 [3, 3, 256, 256] 589824 | |
rpn/conv0/b:0 [256] 256 | |
rpn/class/W:0 [1, 1, 256, 3] 768 | |
rpn/class/b:0 [3] 3 | |
rpn/box/W:0 [1, 1, 256, 12] 3072 | |
rpn/box/b:0 [12] 12 | |
fastrcnn/fc6/W:0 [12544, 1024] 12845056 | |
fastrcnn/fc6/b:0 [1024] 1024 | |
fastrcnn/fc7/W:0 [1024, 1024] 1048576 | |
fastrcnn/fc7/b:0 [1024] 1024 | |
fastrcnn/outputs/class/W:0 [1024, 81] 82944 | |
fastrcnn/outputs/class/b:0 [81] 81 | |
fastrcnn/outputs/box/W:0 [1024, 324] 331776 | |
fastrcnn/outputs/box/b:0 [324] 324 | |
maskrcnn/fcn0/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn0/b:0 [256] 256 | |
maskrcnn/fcn1/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn1/b:0 [256] 256 | |
maskrcnn/fcn2/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn2/b:0 [256] 256 | |
maskrcnn/fcn3/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn3/b:0 [256] 256 | |
maskrcnn/deconv/W:0 [2, 2, 256, 256] 262144 | |
maskrcnn/deconv/b:0 [256] 256 | |
maskrcnn/conv/W:0 [1, 1, 256, 80] 20480 | |
maskrcnn/conv/b:0 [80] 80[36m | |
Total #vars=168, #params=44175092, size=168.51MB[0m | |
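The reported size is consistent with the parameter count at 4 bytes per float32 parameter (the bytes-per-parameter assumption is ours; the count is taken from the line above):

    n_params = 44175092
    size_mib = n_params * 4 / (1024 ** 2)   # float32 -> 4 bytes each
    print(round(size_mib, 2))               # 168.51, matching size=168.51MB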
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback EstimatedTimeLeft is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback SessionRunTimeout is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback ThroughputTracker is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback MovingAverageSummary is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:160][0m [5m[31mWRN[0m Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[32m[0308 18:59:02 @base.py:208][0m Setup callbacks graph ... | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/upsample_lat5 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/upsample_lat4 input: [None, 256, None, None] | |
W0308 18:59:02.643932 139697895950080 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/upsample_lat4 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/upsample_lat3 input: [None, 256, None, None] | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/upsample_lat3 output: [None, 256, None, None] | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/posthoc_3x3_p2 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/posthoc_3x3_p2 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/posthoc_3x3_p3 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/posthoc_3x3_p3 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/posthoc_3x3_p4 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/posthoc_3x3_p4 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/posthoc_3x3_p5 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/posthoc_3x3_p5 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m fpn/maxpool_p6 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn/maxpool_p6 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m fpn output: [None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None],[None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m rpn input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m rpn/conv0 input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m rpn/conv0 output: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m rpn/class input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m rpn/class output: [None, 3, None, None] | |
[32m[0308 18:59:02 @registry.py:125][0m rpn/box input: [None, 256, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m rpn/box output: [None, 12, None, None] | |
[32m[0308 18:59:02 @registry.py:133][0m rpn output: [None, None, None, 3],[None, 12, None, None] | |
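The RPN head shapes imply 3 anchors per spatial location on each FPN level (rpn/class has 3 output channels) and 4 box deltas per anchor (rpn/box has 12 = 3 * 4 channels). A trivial consistency check, with the anchor count read off the shapes rather than the config:

    ANCHORS_PER_LOCATION = 3                 # rpn/class output: [None, 3, None, None]
    assert ANCHORS_PER_LOCATION * 4 == 12    # rpn/box output:   [None, 12, None, None]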
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 0: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 0: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 0: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 0: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 1: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 1: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 1: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 1: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 2: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 2: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 2: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 2: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 3: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 3: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 3: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 3: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] scores, lvl 4: (?, 3, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] bbox_deltas (reshaped), lvl 4: (?, 12, ?, ?) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] im_info, lvl 4: (?, 2) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] anchors, lvl 4: (3, 4) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (0): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_boxes (1): (?, 5) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (0): (?,) | |
[buildtime_shape] [model_fpn.generate_fpn_proposals_batch_tf_op] proposal_scores (1): (?,) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
W0308 18:59:03.309465 139697895950080 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
[32m[0308 18:59:03 @base.py:229][0m Creating the session ... | |
[32m[0308 18:59:03 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:03 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback EstimatedTimeLeft is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback SessionRunTimeout is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback ThroughputTracker is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback MovingAverageSummary is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:160][0m [5m[31mWRN[0m Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[32m[0308 18:59:03 @base.py:208][0m Setup callbacks graph ... | |
[32m[0308 18:59:04 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:04 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback EstimatedTimeLeft is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback SessionRunTimeout is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback ThroughputTracker is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback MovingAverageSummary is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:208][0m Setup callbacks graph ... | |
[32m[0308 18:59:04 @base.py:229][0m Creating the session ... | |
[32m[0308 18:59:04 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:04 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback EstimatedTimeLeft is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback SessionRunTimeout is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback ThroughputTracker is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback MovingAverageSummary is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:160][0m [5m[31mWRN[0m Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[32m[0308 18:59:04 @base.py:208][0m Setup callbacks graph ... | |
[32m[0308 18:59:05 @base.py:229][0m Creating the session ... | |
[32m[0308 18:59:05 @base.py:229][0m Creating the session ... | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[32m[0308 18:59:07 @monitor.py:257][0m [5m[31mWRN[0m logger directory was not set. Ignore TFEventWriter. | |
[32m[0308 18:59:07 @monitor.py:298][0m [5m[31mWRN[0m logger directory was not set. Ignore JSONWriter. | |
[32m[0308 18:59:07 @model_utils.py:64][0m [36mTrainable Variables: | |
[0mname shape dim | |
------------------------------------- ------------------ -------- | |
group1/block0/conv1/W:0 [1, 1, 256, 128] 32768 | |
group1/block0/conv1/bn/gamma:0 [128] 128 | |
group1/block0/conv1/bn/beta:0 [128] 128 | |
group1/block0/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block0/conv2/bn/gamma:0 [128] 128 | |
group1/block0/conv2/bn/beta:0 [128] 128 | |
group1/block0/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block0/conv3/bn/gamma:0 [512] 512 | |
group1/block0/conv3/bn/beta:0 [512] 512 | |
group1/block0/convshortcut/W:0 [1, 1, 256, 512] 131072 | |
group1/block0/convshortcut/bn/gamma:0 [512] 512 | |
group1/block0/convshortcut/bn/beta:0 [512] 512 | |
group1/block1/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block1/conv1/bn/gamma:0 [128] 128 | |
group1/block1/conv1/bn/beta:0 [128] 128 | |
group1/block1/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block1/conv2/bn/gamma:0 [128] 128 | |
group1/block1/conv2/bn/beta:0 [128] 128 | |
group1/block1/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block1/conv3/bn/gamma:0 [512] 512 | |
group1/block1/conv3/bn/beta:0 [512] 512 | |
group1/block2/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block2/conv1/bn/gamma:0 [128] 128 | |
group1/block2/conv1/bn/beta:0 [128] 128 | |
group1/block2/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block2/conv2/bn/gamma:0 [128] 128 | |
group1/block2/conv2/bn/beta:0 [128] 128 | |
group1/block2/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block2/conv3/bn/gamma:0 [512] 512 | |
group1/block2/conv3/bn/beta:0 [512] 512 | |
group1/block3/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block3/conv1/bn/gamma:0 [128] 128 | |
group1/block3/conv1/bn/beta:0 [128] 128 | |
group1/block3/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block3/conv2/bn/gamma:0 [128] 128 | |
group1/block3/conv2/bn/beta:0 [128] 128 | |
group1/block3/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block3/conv3/bn/gamma:0 [512] 512 | |
group1/block3/conv3/bn/beta:0 [512] 512 | |
group2/block0/conv1/W:0 [1, 1, 512, 256] 131072 | |
group2/block0/conv1/bn/gamma:0 [256] 256 | |
group2/block0/conv1/bn/beta:0 [256] 256 | |
group2/block0/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block0/conv2/bn/gamma:0 [256] 256 | |
group2/block0/conv2/bn/beta:0 [256] 256 | |
group2/block0/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block0/conv3/bn/gamma:0 [1024] 1024 | |
group2/block0/conv3/bn/beta:0 [1024] 1024 | |
group2/block0/convshortcut/W:0 [1, 1, 512, 1024] 524288 | |
group2/block0/convshortcut/bn/gamma:0 [1024] 1024 | |
group2/block0/convshortcut/bn/beta:0 [1024] 1024 | |
group2/block1/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block1/conv1/bn/gamma:0 [256] 256 | |
group2/block1/conv1/bn/beta:0 [256] 256 | |
group2/block1/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block1/conv2/bn/gamma:0 [256] 256 | |
group2/block1/conv2/bn/beta:0 [256] 256 | |
group2/block1/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block1/conv3/bn/gamma:0 [1024] 1024 | |
group2/block1/conv3/bn/beta:0 [1024] 1024 | |
group2/block2/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block2/conv1/bn/gamma:0 [256] 256 | |
group2/block2/conv1/bn/beta:0 [256] 256 | |
group2/block2/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block2/conv2/bn/gamma:0 [256] 256 | |
group2/block2/conv2/bn/beta:0 [256] 256 | |
group2/block2/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block2/conv3/bn/gamma:0 [1024] 1024 | |
group2/block2/conv3/bn/beta:0 [1024] 1024 | |
group2/block3/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block3/conv1/bn/gamma:0 [256] 256 | |
group2/block3/conv1/bn/beta:0 [256] 256 | |
group2/block3/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block3/conv2/bn/gamma:0 [256] 256 | |
group2/block3/conv2/bn/beta:0 [256] 256 | |
group2/block3/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block3/conv3/bn/gamma:0 [1024] 1024 | |
group2/block3/conv3/bn/beta:0 [1024] 1024 | |
group2/block4/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block4/conv1/bn/gamma:0 [256] 256 | |
group2/block4/conv1/bn/beta:0 [256] 256 | |
group2/block4/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block4/conv2/bn/gamma:0 [256] 256 | |
group2/block4/conv2/bn/beta:0 [256] 256 | |
group2/block4/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block4/conv3/bn/gamma:0 [1024] 1024 | |
group2/block4/conv3/bn/beta:0 [1024] 1024 | |
group2/block5/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block5/conv1/bn/gamma:0 [256] 256 | |
group2/block5/conv1/bn/beta:0 [256] 256 | |
group2/block5/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block5/conv2/bn/gamma:0 [256] 256 | |
group2/block5/conv2/bn/beta:0 [256] 256 | |
group2/block5/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block5/conv3/bn/gamma:0 [1024] 1024 | |
group2/block5/conv3/bn/beta:0 [1024] 1024 | |
group3/block0/conv1/W:0 [1, 1, 1024, 512] 524288 | |
group3/block0/conv1/bn/gamma:0 [512] 512 | |
group3/block0/conv1/bn/beta:0 [512] 512 | |
group3/block0/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block0/conv2/bn/gamma:0 [512] 512 | |
group3/block0/conv2/bn/beta:0 [512] 512 | |
group3/block0/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block0/conv3/bn/gamma:0 [2048] 2048 | |
group3/block0/conv3/bn/beta:0 [2048] 2048 | |
group3/block0/convshortcut/W:0 [1, 1, 1024, 2048] 2097152 | |
group3/block0/convshortcut/bn/gamma:0 [2048] 2048 | |
group3/block0/convshortcut/bn/beta:0 [2048] 2048 | |
group3/block1/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block1/conv1/bn/gamma:0 [512] 512 | |
group3/block1/conv1/bn/beta:0 [512] 512 | |
group3/block1/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block1/conv2/bn/gamma:0 [512] 512 | |
group3/block1/conv2/bn/beta:0 [512] 512 | |
group3/block1/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block1/conv3/bn/gamma:0 [2048] 2048 | |
group3/block1/conv3/bn/beta:0 [2048] 2048 | |
group3/block2/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block2/conv1/bn/gamma:0 [512] 512 | |
group3/block2/conv1/bn/beta:0 [512] 512 | |
group3/block2/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block2/conv2/bn/gamma:0 [512] 512 | |
group3/block2/conv2/bn/beta:0 [512] 512 | |
group3/block2/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block2/conv3/bn/gamma:0 [2048] 2048 | |
group3/block2/conv3/bn/beta:0 [2048] 2048 | |
fpn/lateral_1x1_c2/W:0 [1, 1, 256, 256] 65536 | |
fpn/lateral_1x1_c2/b:0 [256] 256 | |
fpn/lateral_1x1_c3/W:0 [1, 1, 512, 256] 131072 | |
fpn/lateral_1x1_c3/b:0 [256] 256 | |
fpn/lateral_1x1_c4/W:0 [1, 1, 1024, 256] 262144 | |
fpn/lateral_1x1_c4/b:0 [256] 256 | |
fpn/lateral_1x1_c5/W:0 [1, 1, 2048, 256] 524288 | |
fpn/lateral_1x1_c5/b:0 [256] 256 | |
fpn/posthoc_3x3_p2/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p2/b:0 [256] 256 | |
fpn/posthoc_3x3_p3/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p3/b:0 [256] 256 | |
fpn/posthoc_3x3_p4/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p4/b:0 [256] 256 | |
fpn/posthoc_3x3_p5/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p5/b:0 [256] 256 | |
rpn/conv0/W:0 [3, 3, 256, 256] 589824 | |
rpn/conv0/b:0 [256] 256 | |
rpn/class/W:0 [1, 1, 256, 3] 768 | |
rpn/class/b:0 [3] 3 | |
rpn/box/W:0 [1, 1, 256, 12] 3072 | |
rpn/box/b:0 [12] 12 | |
fastrcnn/fc6/W:0 [12544, 1024] 12845056 | |
fastrcnn/fc6/b:0 [1024] 1024 | |
fastrcnn/fc7/W:0 [1024, 1024] 1048576 | |
fastrcnn/fc7/b:0 [1024] 1024 | |
fastrcnn/outputs/class/W:0 [1024, 81] 82944 | |
fastrcnn/outputs/class/b:0 [81] 81 | |
fastrcnn/outputs/box/W:0 [1024, 324] 331776 | |
fastrcnn/outputs/box/b:0 [324] 324 | |
maskrcnn/fcn0/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn0/b:0 [256] 256 | |
maskrcnn/fcn1/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn1/b:0 [256] 256 | |
maskrcnn/fcn2/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn2/b:0 [256] 256 | |
maskrcnn/fcn3/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn3/b:0 [256] 256 | |
maskrcnn/deconv/W:0 [2, 2, 256, 256] 262144 | |
maskrcnn/deconv/b:0 [256] 256 | |
maskrcnn/conv/W:0 [1, 1, 256, 80] 20480 | |
maskrcnn/conv/b:0 [80] 80 | |
Total #vars=168, #params=44175092, size=168.51MB | |
[0308 18:59:07 @base.py:160] WRN Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[0308 18:59:07 @base.py:160] WRN Callback EstimatedTimeLeft is chief-only, skipped. | |
[0308 18:59:07 @base.py:160] WRN Callback SessionRunTimeout is chief-only, skipped. | |
[0308 18:59:07 @base.py:160] WRN Callback ThroughputTracker is chief-only, skipped. | |
[0308 18:59:07 @base.py:160] WRN Callback MovingAverageSummary is chief-only, skipped. | |
[0308 18:59:07 @base.py:160] WRN Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[0308 18:59:07 @base.py:208] Setup callbacks graph ... | |
[tshape] model_box.encode_bbox_target.boxes: (?, ?, ?, ?, ?) | |
[tshape] model_box.encode_bbox_target.anchors: (?, ?, ?, ?, ?) | |
[0308 18:59:08 @model_utils.py:64] Trainable Variables: | |
name shape dim | |
------------------------------------- ------------------ -------- | |
group1/block0/conv1/W:0 [1, 1, 256, 128] 32768 | |
group1/block0/conv1/bn/gamma:0 [128] 128 | |
group1/block0/conv1/bn/beta:0 [128] 128 | |
group1/block0/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block0/conv2/bn/gamma:0 [128] 128 | |
group1/block0/conv2/bn/beta:0 [128] 128 | |
group1/block0/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block0/conv3/bn/gamma:0 [512] 512 | |
group1/block0/conv3/bn/beta:0 [512] 512 | |
group1/block0/convshortcut/W:0 [1, 1, 256, 512] 131072 | |
group1/block0/convshortcut/bn/gamma:0 [512] 512 | |
group1/block0/convshortcut/bn/beta:0 [512] 512 | |
group1/block1/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block1/conv1/bn/gamma:0 [128] 128 | |
group1/block1/conv1/bn/beta:0 [128] 128 | |
group1/block1/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block1/conv2/bn/gamma:0 [128] 128 | |
group1/block1/conv2/bn/beta:0 [128] 128 | |
group1/block1/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block1/conv3/bn/gamma:0 [512] 512 | |
group1/block1/conv3/bn/beta:0 [512] 512 | |
group1/block2/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block2/conv1/bn/gamma:0 [128] 128 | |
group1/block2/conv1/bn/beta:0 [128] 128 | |
group1/block2/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block2/conv2/bn/gamma:0 [128] 128 | |
group1/block2/conv2/bn/beta:0 [128] 128 | |
group1/block2/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block2/conv3/bn/gamma:0 [512] 512 | |
group1/block2/conv3/bn/beta:0 [512] 512 | |
group1/block3/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block3/conv1/bn/gamma:0 [128] 128 | |
group1/block3/conv1/bn/beta:0 [128] 128 | |
group1/block3/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block3/conv2/bn/gamma:0 [128] 128 | |
group1/block3/conv2/bn/beta:0 [128] 128 | |
group1/block3/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block3/conv3/bn/gamma:0 [512] 512 | |
group1/block3/conv3/bn/beta:0 [512] 512 | |
group2/block0/conv1/W:0 [1, 1, 512, 256] 131072 | |
group2/block0/conv1/bn/gamma:0 [256] 256 | |
group2/block0/conv1/bn/beta:0 [256] 256 | |
group2/block0/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block0/conv2/bn/gamma:0 [256] 256 | |
group2/block0/conv2/bn/beta:0 [256] 256 | |
group2/block0/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block0/conv3/bn/gamma:0 [1024] 1024 | |
group2/block0/conv3/bn/beta:0 [1024] 1024 | |
group2/block0/convshortcut/W:0 [1, 1, 512, 1024] 524288 | |
group2/block0/convshortcut/bn/gamma:0 [1024] 1024 | |
group2/block0/convshortcut/bn/beta:0 [1024] 1024 | |
group2/block1/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block1/conv1/bn/gamma:0 [256] 256 | |
group2/block1/conv1/bn/beta:0 [256] 256 | |
group2/block1/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block1/conv2/bn/gamma:0 [256] 256 | |
group2/block1/conv2/bn/beta:0 [256] 256 | |
group2/block1/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block1/conv3/bn/gamma:0 [1024] 1024 | |
group2/block1/conv3/bn/beta:0 [1024] 1024 | |
group2/block2/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block2/conv1/bn/gamma:0 [256] 256 | |
group2/block2/conv1/bn/beta:0 [256] 256 | |
group2/block2/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block2/conv2/bn/gamma:0 [256] 256 | |
group2/block2/conv2/bn/beta:0 [256] 256 | |
group2/block2/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block2/conv3/bn/gamma:0 [1024] 1024 | |
group2/block2/conv3/bn/beta:0 [1024] 1024 | |
group2/block3/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block3/conv1/bn/gamma:0 [256] 256 | |
group2/block3/conv1/bn/beta:0 [256] 256 | |
group2/block3/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block3/conv2/bn/gamma:0 [256] 256 | |
group2/block3/conv2/bn/beta:0 [256] 256 | |
group2/block3/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block3/conv3/bn/gamma:0 [1024] 1024 | |
group2/block3/conv3/bn/beta:0 [1024] 1024 | |
group2/block4/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block4/conv1/bn/gamma:0 [256] 256 | |
group2/block4/conv1/bn/beta:0 [256] 256 | |
group2/block4/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block4/conv2/bn/gamma:0 [256] 256 | |
group2/block4/conv2/bn/beta:0 [256] 256 | |
group2/block4/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block4/conv3/bn/gamma:0 [1024] 1024 | |
group2/block4/conv3/bn/beta:0 [1024] 1024 | |
group2/block5/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block5/conv1/bn/gamma:0 [256] 256 | |
group2/block5/conv1/bn/beta:0 [256] 256 | |
group2/block5/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block5/conv2/bn/gamma:0 [256] 256 | |
group2/block5/conv2/bn/beta:0 [256] 256 | |
group2/block5/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block5/conv3/bn/gamma:0 [1024] 1024 | |
group2/block5/conv3/bn/beta:0 [1024] 1024 | |
group3/block0/conv1/W:0 [1, 1, 1024, 512] 524288 | |
group3/block0/conv1/bn/gamma:0 [512] 512 | |
group3/block0/conv1/bn/beta:0 [512] 512 | |
group3/block0/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block0/conv2/bn/gamma:0 [512] 512 | |
group3/block0/conv2/bn/beta:0 [512] 512 | |
group3/block0/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block0/conv3/bn/gamma:0 [2048] 2048 | |
group3/block0/conv3/bn/beta:0 [2048] 2048 | |
group3/block0/convshortcut/W:0 [1, 1, 1024, 2048] 2097152 | |
group3/block0/convshortcut/bn/gamma:0 [2048] 2048 | |
group3/block0/convshortcut/bn/beta:0 [2048] 2048 | |
group3/block1/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block1/conv1/bn/gamma:0 [512] 512 | |
group3/block1/conv1/bn/beta:0 [512] 512 | |
group3/block1/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block1/conv2/bn/gamma:0 [512] 512 | |
group3/block1/conv2/bn/beta:0 [512] 512 | |
group3/block1/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block1/conv3/bn/gamma:0 [2048] 2048 | |
group3/block1/conv3/bn/beta:0 [2048] 2048 | |
group3/block2/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block2/conv1/bn/gamma:0 [512] 512 | |
group3/block2/conv1/bn/beta:0 [512] 512 | |
group3/block2/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block2/conv2/bn/gamma:0 [512] 512 | |
group3/block2/conv2/bn/beta:0 [512] 512 | |
group3/block2/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block2/conv3/bn/gamma:0 [2048] 2048 | |
group3/block2/conv3/bn/beta:0 [2048] 2048 | |
fpn/lateral_1x1_c2/W:0 [1, 1, 256, 256] 65536 | |
fpn/lateral_1x1_c2/b:0 [256] 256 | |
fpn/lateral_1x1_c3/W:0 [1, 1, 512, 256] 131072 | |
fpn/lateral_1x1_c3/b:0 [256] 256 | |
fpn/lateral_1x1_c4/W:0 [1, 1, 1024, 256] 262144 | |
fpn/lateral_1x1_c4/b:0 [256] 256 | |
fpn/lateral_1x1_c5/W:0 [1, 1, 2048, 256] 524288 | |
fpn/lateral_1x1_c5/b:0 [256] 256 | |
fpn/posthoc_3x3_p2/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p2/b:0 [256] 256 | |
fpn/posthoc_3x3_p3/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p3/b:0 [256] 256 | |
fpn/posthoc_3x3_p4/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p4/b:0 [256] 256 | |
fpn/posthoc_3x3_p5/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p5/b:0 [256] 256 | |
rpn/conv0/W:0 [3, 3, 256, 256] 589824 | |
rpn/conv0/b:0 [256] 256 | |
rpn/class/W:0 [1, 1, 256, 3] 768 | |
rpn/class/b:0 [3] 3 | |
rpn/box/W:0 [1, 1, 256, 12] 3072 | |
rpn/box/b:0 [12] 12 | |
fastrcnn/fc6/W:0 [12544, 1024] 12845056 | |
fastrcnn/fc6/b:0 [1024] 1024 | |
fastrcnn/fc7/W:0 [1024, 1024] 1048576 | |
fastrcnn/fc7/b:0 [1024] 1024 | |
fastrcnn/outputs/class/W:0 [1024, 81] 82944 | |
fastrcnn/outputs/class/b:0 [81] 81 | |
fastrcnn/outputs/box/W:0 [1024, 324] 331776 | |
fastrcnn/outputs/box/b:0 [324] 324 | |
maskrcnn/fcn0/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn0/b:0 [256] 256 | |
maskrcnn/fcn1/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn1/b:0 [256] 256 | |
maskrcnn/fcn2/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn2/b:0 [256] 256 | |
maskrcnn/fcn3/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn3/b:0 [256] 256 | |
maskrcnn/deconv/W:0 [2, 2, 256, 256] 262144 | |
maskrcnn/deconv/b:0 [256] 256 | |
maskrcnn/conv/W:0 [1, 1, 256, 80] 20480 | |
maskrcnn/conv/b:0 [80] 80 | |
Total #vars=168, #params=44175092, size=168.51MB | |
[0308 18:59:08 @base.py:208] Setup callbacks graph ... | |
[0308 18:59:08 @base.py:229] Creating the session ... | |
[buildtime_shape] [proposal_metrics_batch] mean_of_mean_best_iou: () | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=0: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=0: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] box_mask_for_image, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_images_row_indices, btch_idx=1: (?,) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [sample_fast_rcnn_targets_batch] single_image_ret_boxes, btch_idx=1: (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[0308 18:59:09 @registry.py:125] fastrcnn input: [None, 256, 7, 7] | |
[0308 18:59:09 @registry.py:125] fastrcnn/fc6 input: [None, 256, 7, 7] | |
[0308 18:59:09 @registry.py:133] fastrcnn/fc6 output: [None, 1024] | |
[0308 18:59:09 @registry.py:125] fastrcnn/fc7 input: [None, 1024] | |
[0308 18:59:09 @registry.py:133] fastrcnn/fc7 output: [None, 1024] | |
[0308 18:59:09 @registry.py:133] fastrcnn output: [None, 1024] | |
[buildtime_shape] [train.roi_heads] head_feature: (?, 1024) | |
[0308 18:59:09 @registry.py:125] fastrcnn/outputs input: [None, 1024] | |
[0308 18:59:09 @registry.py:125] fastrcnn/outputs/class input: [None, 1024] | |
[0308 18:59:09 @registry.py:133] fastrcnn/outputs/class output: [None, 81] | |
[0308 18:59:09 @registry.py:125] fastrcnn/outputs/box input: [None, 1024] | |
[0308 18:59:09 @summary.py:46] [MovingAverageSummary] 125 operations in collection 'MOVING_SUMMARY_OPS' will be run with session hooks. | |
[0308 18:59:09 @summary.py:93] Summarizing collection 'summaries' of size 128. | |
[0308 18:59:09 @registry.py:133] fastrcnn/outputs/box output: [None, 324] | |
[0308 18:59:09 @registry.py:133] fastrcnn/outputs output: [None, 81],[None, 81, 4] | |
self.training == True | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[tshape] model_box.encode_bbox_target.boxes: (?, 4) | |
[tshape] model_box.encode_bbox_target.anchors: (?, 4) | |
[buildtime_shape] [FastRCNNHeadBatch.losses] single_image_box_logits: (?, 81, 4) | |
[0308 18:59:09 @base.py:242] Graph Finalized. | |
[0308 18:59:09 @trainers.py:453] Rank 1 waiting for initialization broadcasting ... | |
labels Tensor("concat:0", shape=(?,), dtype=int64) | |
label_logits Tensor("concat_1:0", shape=(?, 81), dtype=float32) | |
fg_boxes Tensor("concat_2:0", shape=(?, 4), dtype=float32) | |
fg_box_logits Tensor("concat_3:0", shape=(?, 81, 4), dtype=float32) | |
[buildtime_shape] [tf_area_batch] boxes (raw): (?, 5) | |
[buildtime_shape] [tf_area_batch] boxes (processed): (?, 4) | |
[0308 18:59:10 @registry.py:125] maskrcnn input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:125] maskrcnn/fcn0 input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:133] maskrcnn/fcn0 output: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:125] maskrcnn/fcn1 input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:133] maskrcnn/fcn1 output: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:125] maskrcnn/fcn2 input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:133] maskrcnn/fcn2 output: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:125] maskrcnn/fcn3 input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:133] maskrcnn/fcn3 output: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:125] maskrcnn/deconv input: [None, 256, 14, 14] | |
[0308 18:59:10 @registry.py:133] maskrcnn/deconv output: [None, 256, 28, 28] | |
[0308 18:59:10 @registry.py:125] maskrcnn/conv input: [None, 256, 28, 28] | |
[0308 18:59:10 @registry.py:133] maskrcnn/conv output: [None, 80, 28, 28] | |
[0308 18:59:10 @registry.py:133] maskrcnn output: [None, 80, 28, 28] | |
W0308 18:59:10.170388 140223030650624 deprecation.py:506] From /home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/model_box.py:215: calling crop_and_resize_v1 (from tensorflow.python.ops.image_ops_impl) with box_ind is deprecated and will be removed in a future version. | |
Instructions for updating: | |
box_ind is deprecated, use box_indices instead | |
[buildtime_shape] [roi_heads, batch_idx 0] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [roi_heads, batch_idx 1] single_image_image_target_masks_for_fg: (?, 1, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] mask_logits: (?, 80, 28, 28) | |
[buildtime_shape] [maskrcnn_loss] fg_labels: (?,) | |
[buildtime_shape] [maskrcnn_loss] fg_target_masks: (?, 28, 28) | |
[0308 18:59:10 @base.py:242] Graph Finalized. | |
[0308 18:59:10 @trainers.py:453] Rank 5 waiting for initialization broadcasting ... | |
[0308 18:59:10 @regularize.py:95] regularize_cost() found 63 variables to regularize. | |
[0308 18:59:10 @regularize.py:20] The following tensors will be regularized: group1/block0/conv1/W:0, group1/block0/conv2/W:0, group1/block0/conv3/W:0, group1/block0/convshortcut/W:0, group1/block1/conv1/W:0, group1/block1/conv2/W:0, group1/block1/conv3/W:0, group1/block2/conv1/W:0, group1/block2/conv2/W:0, group1/block2/conv3/W:0, group1/block3/conv1/W:0, group1/block3/conv2/W:0, group1/block3/conv3/W:0, group2/block0/conv1/W:0, group2/block0/conv2/W:0, group2/block0/conv3/W:0, group2/block0/convshortcut/W:0, group2/block1/conv1/W:0, group2/block1/conv2/W:0, group2/block1/conv3/W:0, group2/block2/conv1/W:0, group2/block2/conv2/W:0, group2/block2/conv3/W:0, group2/block3/conv1/W:0, group2/block3/conv2/W:0, group2/block3/conv3/W:0, group2/block4/conv1/W:0, group2/block4/conv2/W:0, group2/block4/conv3/W:0, group2/block5/conv1/W:0, group2/block5/conv2/W:0, group2/block5/conv3/W:0, group3/block0/conv1/W:0, group3/block0/conv2/W:0, group3/block0/conv3/W:0, group3/block0/convshortcut/W:0, group3/block1/conv1/W:0, group3/block1/conv2/W:0, group3/block1/conv3/W:0, group3/block2/conv1/W:0, group3/block2/conv2/W:0, group3/block2/conv3/W:0, fpn/lateral_1x1_c2/W:0, fpn/lateral_1x1_c3/W:0, fpn/lateral_1x1_c4/W:0, fpn/lateral_1x1_c5/W:0, fpn/posthoc_3x3_p2/W:0, fpn/posthoc_3x3_p3/W:0, fpn/posthoc_3x3_p4/W:0, fpn/posthoc_3x3_p5/W:0, rpn/conv0/W:0, rpn/class/W:0, rpn/box/W:0, fastrcnn/fc6/W:0, fastrcnn/fc7/W:0, fastrcnn/outputs/class/W:0, fastrcnn/outputs/box/W:0, maskrcnn/fcn0/W:0, maskrcnn/fcn1/W:0, maskrcnn/fcn2/W:0, maskrcnn/fcn3/W:0, maskrcnn/deconv/W:0, maskrcnn/conv/W:0 | |
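Annotation (not part of the original console output): a minimal TF1-style sketch of the weight-decay term the regularize_cost() lines above report, i.e. an L2 penalty collected over every kernel whose name matches a '.*/W' pattern. The regex and the decay factor below are illustrative assumptions, not the exact values used by this run.

import re
import tensorflow as tf

def weight_decay_cost(regex=r".*/W$", factor=1e-4):
    # Collect trainable kernels whose op name matches the pattern, e.g. group1/block0/conv1/W.
    matched = [v for v in tf.trainable_variables() if re.match(regex, v.op.name)]
    # Sum 0.5*||W||^2 over the matched tensors and scale by the decay factor.
    return tf.multiply(factor, tf.add_n([tf.nn.l2_loss(v) for v in matched]), name="wd_cost")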
W0308 18:59:10.815016 140223030650624 deprecation.py:323] From /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/array_grad.py:425: to_int32 (from tensorflow.python.ops.math_ops) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Use tf.cast instead. | |
[0308 18:59:11 @base.py:242] Graph Finalized. | |
[0308 18:59:11 @trainers.py:453] Rank 6 waiting for initialization broadcasting ... | |
[0308 18:59:12 @base.py:242] Graph Finalized. | |
[0308 18:59:12 @trainers.py:453] Rank 2 waiting for initialization broadcasting ... | |
[0308 18:59:12 @base.py:242] Graph Finalized. | |
[0308 18:59:12 @trainers.py:453] Rank 4 waiting for initialization broadcasting ... | |
[0308 18:59:14 @base.py:229] Creating the session ... | |
[0308 18:59:15 @base.py:242] Graph Finalized. | |
[0308 18:59:15 @trainers.py:453] Rank 7 waiting for initialization broadcasting ... | |
[0308 18:59:16 @monitor.py:257] WRN logger directory was not set. Ignore TFEventWriter. | |
[0308 18:59:16 @monitor.py:298] WRN logger directory was not set. Ignore JSONWriter. | |
[0308 18:59:16 @model_utils.py:64] Trainable Variables: | |
name shape dim | |
------------------------------------- ------------------ -------- | |
group1/block0/conv1/W:0 [1, 1, 256, 128] 32768 | |
group1/block0/conv1/bn/gamma:0 [128] 128 | |
group1/block0/conv1/bn/beta:0 [128] 128 | |
group1/block0/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block0/conv2/bn/gamma:0 [128] 128 | |
group1/block0/conv2/bn/beta:0 [128] 128 | |
group1/block0/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block0/conv3/bn/gamma:0 [512] 512 | |
group1/block0/conv3/bn/beta:0 [512] 512 | |
group1/block0/convshortcut/W:0 [1, 1, 256, 512] 131072 | |
group1/block0/convshortcut/bn/gamma:0 [512] 512 | |
group1/block0/convshortcut/bn/beta:0 [512] 512 | |
group1/block1/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block1/conv1/bn/gamma:0 [128] 128 | |
group1/block1/conv1/bn/beta:0 [128] 128 | |
group1/block1/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block1/conv2/bn/gamma:0 [128] 128 | |
group1/block1/conv2/bn/beta:0 [128] 128 | |
group1/block1/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block1/conv3/bn/gamma:0 [512] 512 | |
group1/block1/conv3/bn/beta:0 [512] 512 | |
group1/block2/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block2/conv1/bn/gamma:0 [128] 128 | |
group1/block2/conv1/bn/beta:0 [128] 128 | |
group1/block2/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block2/conv2/bn/gamma:0 [128] 128 | |
group1/block2/conv2/bn/beta:0 [128] 128 | |
group1/block2/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block2/conv3/bn/gamma:0 [512] 512 | |
group1/block2/conv3/bn/beta:0 [512] 512 | |
group1/block3/conv1/W:0 [1, 1, 512, 128] 65536 | |
group1/block3/conv1/bn/gamma:0 [128] 128 | |
group1/block3/conv1/bn/beta:0 [128] 128 | |
group1/block3/conv2/W:0 [3, 3, 128, 128] 147456 | |
group1/block3/conv2/bn/gamma:0 [128] 128 | |
group1/block3/conv2/bn/beta:0 [128] 128 | |
group1/block3/conv3/W:0 [1, 1, 128, 512] 65536 | |
group1/block3/conv3/bn/gamma:0 [512] 512 | |
group1/block3/conv3/bn/beta:0 [512] 512 | |
group2/block0/conv1/W:0 [1, 1, 512, 256] 131072 | |
group2/block0/conv1/bn/gamma:0 [256] 256 | |
group2/block0/conv1/bn/beta:0 [256] 256 | |
group2/block0/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block0/conv2/bn/gamma:0 [256] 256 | |
group2/block0/conv2/bn/beta:0 [256] 256 | |
group2/block0/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block0/conv3/bn/gamma:0 [1024] 1024 | |
group2/block0/conv3/bn/beta:0 [1024] 1024 | |
group2/block0/convshortcut/W:0 [1, 1, 512, 1024] 524288 | |
group2/block0/convshortcut/bn/gamma:0 [1024] 1024 | |
group2/block0/convshortcut/bn/beta:0 [1024] 1024 | |
group2/block1/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block1/conv1/bn/gamma:0 [256] 256 | |
group2/block1/conv1/bn/beta:0 [256] 256 | |
group2/block1/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block1/conv2/bn/gamma:0 [256] 256 | |
group2/block1/conv2/bn/beta:0 [256] 256 | |
group2/block1/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block1/conv3/bn/gamma:0 [1024] 1024 | |
group2/block1/conv3/bn/beta:0 [1024] 1024 | |
group2/block2/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block2/conv1/bn/gamma:0 [256] 256 | |
group2/block2/conv1/bn/beta:0 [256] 256 | |
group2/block2/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block2/conv2/bn/gamma:0 [256] 256 | |
group2/block2/conv2/bn/beta:0 [256] 256 | |
group2/block2/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block2/conv3/bn/gamma:0 [1024] 1024 | |
group2/block2/conv3/bn/beta:0 [1024] 1024 | |
group2/block3/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block3/conv1/bn/gamma:0 [256] 256 | |
group2/block3/conv1/bn/beta:0 [256] 256 | |
group2/block3/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block3/conv2/bn/gamma:0 [256] 256 | |
group2/block3/conv2/bn/beta:0 [256] 256 | |
group2/block3/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block3/conv3/bn/gamma:0 [1024] 1024 | |
group2/block3/conv3/bn/beta:0 [1024] 1024 | |
group2/block4/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block4/conv1/bn/gamma:0 [256] 256 | |
group2/block4/conv1/bn/beta:0 [256] 256 | |
group2/block4/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block4/conv2/bn/gamma:0 [256] 256 | |
group2/block4/conv2/bn/beta:0 [256] 256 | |
group2/block4/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block4/conv3/bn/gamma:0 [1024] 1024 | |
group2/block4/conv3/bn/beta:0 [1024] 1024 | |
group2/block5/conv1/W:0 [1, 1, 1024, 256] 262144 | |
group2/block5/conv1/bn/gamma:0 [256] 256 | |
group2/block5/conv1/bn/beta:0 [256] 256 | |
group2/block5/conv2/W:0 [3, 3, 256, 256] 589824 | |
group2/block5/conv2/bn/gamma:0 [256] 256 | |
group2/block5/conv2/bn/beta:0 [256] 256 | |
group2/block5/conv3/W:0 [1, 1, 256, 1024] 262144 | |
group2/block5/conv3/bn/gamma:0 [1024] 1024 | |
group2/block5/conv3/bn/beta:0 [1024] 1024 | |
group3/block0/conv1/W:0 [1, 1, 1024, 512] 524288 | |
group3/block0/conv1/bn/gamma:0 [512] 512 | |
group3/block0/conv1/bn/beta:0 [512] 512 | |
group3/block0/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block0/conv2/bn/gamma:0 [512] 512 | |
group3/block0/conv2/bn/beta:0 [512] 512 | |
group3/block0/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block0/conv3/bn/gamma:0 [2048] 2048 | |
group3/block0/conv3/bn/beta:0 [2048] 2048 | |
group3/block0/convshortcut/W:0 [1, 1, 1024, 2048] 2097152 | |
group3/block0/convshortcut/bn/gamma:0 [2048] 2048 | |
group3/block0/convshortcut/bn/beta:0 [2048] 2048 | |
group3/block1/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block1/conv1/bn/gamma:0 [512] 512 | |
group3/block1/conv1/bn/beta:0 [512] 512 | |
group3/block1/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block1/conv2/bn/gamma:0 [512] 512 | |
group3/block1/conv2/bn/beta:0 [512] 512 | |
group3/block1/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block1/conv3/bn/gamma:0 [2048] 2048 | |
group3/block1/conv3/bn/beta:0 [2048] 2048 | |
group3/block2/conv1/W:0 [1, 1, 2048, 512] 1048576 | |
group3/block2/conv1/bn/gamma:0 [512] 512 | |
group3/block2/conv1/bn/beta:0 [512] 512 | |
group3/block2/conv2/W:0 [3, 3, 512, 512] 2359296 | |
group3/block2/conv2/bn/gamma:0 [512] 512 | |
group3/block2/conv2/bn/beta:0 [512] 512 | |
group3/block2/conv3/W:0 [1, 1, 512, 2048] 1048576 | |
group3/block2/conv3/bn/gamma:0 [2048] 2048 | |
group3/block2/conv3/bn/beta:0 [2048] 2048 | |
fpn/lateral_1x1_c2/W:0 [1, 1, 256, 256] 65536 | |
fpn/lateral_1x1_c2/b:0 [256] 256 | |
fpn/lateral_1x1_c3/W:0 [1, 1, 512, 256] 131072 | |
fpn/lateral_1x1_c3/b:0 [256] 256 | |
fpn/lateral_1x1_c4/W:0 [1, 1, 1024, 256] 262144 | |
fpn/lateral_1x1_c4/b:0 [256] 256 | |
fpn/lateral_1x1_c5/W:0 [1, 1, 2048, 256] 524288 | |
fpn/lateral_1x1_c5/b:0 [256] 256 | |
fpn/posthoc_3x3_p2/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p2/b:0 [256] 256 | |
fpn/posthoc_3x3_p3/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p3/b:0 [256] 256 | |
fpn/posthoc_3x3_p4/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p4/b:0 [256] 256 | |
fpn/posthoc_3x3_p5/W:0 [3, 3, 256, 256] 589824 | |
fpn/posthoc_3x3_p5/b:0 [256] 256 | |
rpn/conv0/W:0 [3, 3, 256, 256] 589824 | |
rpn/conv0/b:0 [256] 256 | |
rpn/class/W:0 [1, 1, 256, 3] 768 | |
rpn/class/b:0 [3] 3 | |
rpn/box/W:0 [1, 1, 256, 12] 3072 | |
rpn/box/b:0 [12] 12 | |
fastrcnn/fc6/W:0 [12544, 1024] 12845056 | |
fastrcnn/fc6/b:0 [1024] 1024 | |
fastrcnn/fc7/W:0 [1024, 1024] 1048576 | |
fastrcnn/fc7/b:0 [1024] 1024 | |
fastrcnn/outputs/class/W:0 [1024, 81] 82944 | |
fastrcnn/outputs/class/b:0 [81] 81 | |
fastrcnn/outputs/box/W:0 [1024, 324] 331776 | |
fastrcnn/outputs/box/b:0 [324] 324 | |
maskrcnn/fcn0/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn0/b:0 [256] 256 | |
maskrcnn/fcn1/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn1/b:0 [256] 256 | |
maskrcnn/fcn2/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn2/b:0 [256] 256 | |
maskrcnn/fcn3/W:0 [3, 3, 256, 256] 589824 | |
maskrcnn/fcn3/b:0 [256] 256 | |
maskrcnn/deconv/W:0 [2, 2, 256, 256] 262144 | |
maskrcnn/deconv/b:0 [256] 256 | |
maskrcnn/conv/W:0 [1, 1, 256, 80] 20480 | |
maskrcnn/conv/b:0 [80] 80 | |
Total #vars=168, #params=44175092, size=168.51MB | |
[0308 18:59:16 @base.py:160] WRN Callback PeriodicCallback-ModelSaver is chief-only, skipped. | |
[0308 18:59:16 @base.py:160] WRN Callback EstimatedTimeLeft is chief-only, skipped. | |
[0308 18:59:16 @base.py:160] WRN Callback SessionRunTimeout is chief-only, skipped. | |
[0308 18:59:16 @base.py:160] WRN Callback ThroughputTracker is chief-only, skipped. | |
[0308 18:59:16 @base.py:160] WRN Callback MovingAverageSummary is chief-only, skipped. | |
[0308 18:59:16 @base.py:160] WRN Callback MergeAllSummaries_RunWithOp is chief-only, skipped. | |
[0308 18:59:16 @base.py:208] Setup callbacks graph ... | |
[0308 18:59:17 @base.py:229] Creating the session ... | |
[0308 18:59:21 @base.py:235] Initializing the session ... | |
[32m[0308 18:59:21 @sessinit.py:204][0m Variables to restore from dict: group3/block2/conv3/bn/mean/EMA:0, group3/block2/conv3/bn/gamma:0, group2/block3/conv2/bn/beta:0, group0/block1/conv1/W:0, group1/block0/convshortcut/bn/beta:0, group1/block1/conv1/bn/mean/EMA:0, group2/block0/conv2/bn/variance/EMA:0, group2/block5/conv2/bn/beta:0, group2/block1/conv1/bn/beta:0, group2/block0/conv2/bn/beta:0, group3/block1/conv2/bn/mean/EMA:0, group1/block2/conv1/bn/variance/EMA:0, group2/block1/conv1/W:0, group3/block0/conv3/bn/beta:0, group2/block3/conv1/bn/beta:0, group2/block5/conv2/bn/variance/EMA:0, group0/block2/conv1/bn/gamma:0, group1/block1/conv3/bn/variance/EMA:0, group0/block2/conv3/W:0, group2/block3/conv3/W:0, group2/block3/conv3/bn/mean/EMA:0, conv0/bn/beta:0, group2/block1/conv2/bn/variance/EMA:0, group0/block0/conv2/bn/gamma:0, group2/block3/conv1/bn/variance/EMA:0, group2/block1/conv3/bn/mean/EMA:0, group0/block2/conv3/bn/gamma:0, group0/block1/conv2/bn/gamma:0, group2/block2/conv2/bn/mean/EMA:0, group1/block0/convshortcut/bn/gamma:0, conv0/bn/mean/EMA:0, group0/block2/conv1/W:0, group0/block2/conv1/bn/mean/EMA:0, group2/block3/conv2/bn/variance/EMA:0, group0/block0/conv3/W:0, group1/block3/conv2/W:0, group3/block0/conv1/bn/variance/EMA:0, group3/block2/conv2/bn/beta:0, group1/block2/conv3/bn/mean/EMA:0, group2/block2/conv3/W:0, group1/block3/conv3/bn/variance/EMA:0, group0/block1/conv3/bn/variance/EMA:0, group1/block0/conv2/W:0, group1/block0/conv3/bn/variance/EMA:0, group1/block2/conv1/bn/mean/EMA:0, group2/block3/conv3/bn/beta:0, group1/block0/convshortcut/bn/mean/EMA:0, group1/block3/conv1/bn/mean/EMA:0, group0/block0/conv1/bn/mean/EMA:0, group1/block1/conv3/bn/gamma:0, group2/block5/conv2/bn/mean/EMA:0, group1/block3/conv2/bn/beta:0, group2/block0/convshortcut/bn/variance/EMA:0, group3/block0/convshortcut/bn/mean/EMA:0, group2/block1/conv2/bn/gamma:0, group3/block1/conv1/bn/mean/EMA:0, group3/block1/conv3/bn/variance/EMA:0, group3/block2/conv1/W:0, group0/block0/convshortcut/W:0, group2/block4/conv2/bn/beta:0, group1/block1/conv3/W:0, group3/block1/conv3/bn/beta:0, group2/block3/conv2/bn/gamma:0, group1/block2/conv3/bn/gamma:0, group0/block0/conv1/bn/gamma:0, group1/block0/conv3/bn/mean/EMA:0, group2/block1/conv3/bn/beta:0, group2/block0/conv3/bn/mean/EMA:0, group1/block3/conv2/bn/mean/EMA:0, group3/block0/conv2/W:0, group0/block2/conv3/bn/beta:0, group2/block4/conv2/bn/gamma:0, group3/block0/conv1/bn/beta:0, group1/block1/conv2/bn/variance/EMA:0, group3/block2/conv1/bn/gamma:0, group2/block0/conv2/bn/gamma:0, group0/block0/conv1/bn/variance/EMA:0, group2/block0/conv3/bn/variance/EMA:0, group3/block1/conv2/bn/beta:0, group3/block0/conv1/bn/gamma:0, group3/block0/conv2/bn/beta:0, group1/block1/conv2/bn/mean/EMA:0, group1/block0/conv1/bn/variance/EMA:0, group1/block2/conv1/bn/gamma:0, group2/block2/conv2/bn/gamma:0, group1/block0/conv2/bn/mean/EMA:0, group1/block2/conv2/bn/variance/EMA:0, group2/block0/convshortcut/bn/beta:0, group2/block4/conv3/bn/beta:0, group0/block1/conv1/bn/mean/EMA:0, group0/block0/conv3/bn/mean/EMA:0, group3/block2/conv2/bn/variance/EMA:0, group0/block1/conv3/W:0, group3/block1/conv1/bn/variance/EMA:0, group2/block0/conv1/bn/mean/EMA:0, group3/block1/conv1/bn/gamma:0, group2/block2/conv1/W:0, group2/block1/conv2/bn/beta:0, group3/block1/conv2/bn/variance/EMA:0, group1/block0/conv3/W:0, group2/block2/conv3/bn/beta:0, group3/block0/convshortcut/W:0, group2/block0/conv1/bn/gamma:0, group3/block0/convshortcut/bn/variance/EMA:0, group1/block0/convshortcut/W:0, 
group2/block0/conv2/bn/mean/EMA:0, group0/block1/conv1/bn/variance/EMA:0, group3/block2/conv1/bn/beta:0, group1/block3/conv1/bn/variance/EMA:0, group2/block5/conv2/W:0, group3/block2/conv1/bn/mean/EMA:0, group1/block2/conv2/bn/mean/EMA:0, group2/block0/conv1/bn/variance/EMA:0, group3/block1/conv1/bn/beta:0, group2/block0/conv3/bn/gamma:0, group2/block5/conv1/bn/variance/EMA:0, group3/block0/conv2/bn/gamma:0, group2/block4/conv3/bn/mean/EMA:0, group0/block0/convshortcut/bn/mean/EMA:0, group1/block1/conv3/bn/beta:0, group0/block2/conv1/bn/variance/EMA:0, group2/block0/convshortcut/bn/mean/EMA:0, group2/block5/conv3/W:0, group1/block0/conv2/bn/gamma:0, group0/block0/conv2/W:0, group3/block0/conv3/W:0, group1/block3/conv2/bn/gamma:0, group1/block3/conv1/W:0, group0/block0/conv3/bn/beta:0, group2/block1/conv3/bn/variance/EMA:0, group1/block2/conv2/bn/beta:0, group2/block3/conv1/W:0, group0/block1/conv3/bn/gamma:0, group3/block0/conv1/W:0, group1/block0/conv3/bn/beta:0, group0/block1/conv2/bn/variance/EMA:0, conv0/W:0, group0/block2/conv3/bn/variance/EMA:0, group2/block4/conv1/bn/variance/EMA:0, conv0/bn/gamma:0, group2/block2/conv1/bn/variance/EMA:0, group2/block3/conv3/bn/variance/EMA:0, group1/block2/conv3/W:0, group1/block0/conv1/W:0, group2/block5/conv1/bn/mean/EMA:0, group3/block0/conv3/bn/gamma:0, group0/block0/conv1/bn/beta:0, group0/block1/conv2/W:0, group3/block1/conv3/bn/gamma:0, group3/block2/conv3/bn/beta:0, group2/block5/conv3/bn/mean/EMA:0, group2/block0/conv3/bn/beta:0, group2/block2/conv2/W:0, group3/block1/conv2/bn/gamma:0, group0/block2/conv2/bn/mean/EMA:0, group2/block0/conv1/bn/beta:0, group1/block1/conv1/bn/variance/EMA:0, conv0/bn/variance/EMA:0, group3/block0/conv3/bn/mean/EMA:0, group3/block2/conv2/bn/mean/EMA:0, group2/block4/conv1/bn/gamma:0, group3/block0/convshortcut/bn/beta:0, group3/block2/conv2/bn/gamma:0, group3/block2/conv3/bn/variance/EMA:0, group3/block2/conv1/bn/variance/EMA:0, group0/block2/conv2/bn/gamma:0, group1/block2/conv1/bn/beta:0, group0/block1/conv1/bn/beta:0, group2/block5/conv3/bn/variance/EMA:0, group2/block4/conv1/bn/beta:0, group0/block0/conv2/bn/variance/EMA:0, group1/block3/conv3/bn/mean/EMA:0, group0/block0/conv3/bn/variance/EMA:0, group2/block5/conv1/bn/gamma:0, group1/block1/conv2/W:0, group2/block1/conv2/W:0, group1/block3/conv3/bn/gamma:0, group3/block0/conv2/bn/variance/EMA:0, group0/block0/conv2/bn/beta:0, group0/block2/conv3/bn/mean/EMA:0, group2/block4/conv2/W:0, group2/block1/conv2/bn/mean/EMA:0, group2/block2/conv2/bn/beta:0, group0/block0/conv1/W:0, group2/block4/conv3/W:0, group1/block3/conv2/bn/variance/EMA:0, group2/block2/conv3/bn/mean/EMA:0, group1/block3/conv3/bn/beta:0, group2/block1/conv1/bn/variance/EMA:0, group2/block3/conv2/W:0, group0/block1/conv3/bn/mean/EMA:0, group2/block2/conv1/bn/beta:0, group2/block2/conv2/bn/variance/EMA:0, group2/block0/convshortcut/bn/gamma:0, group0/block2/conv2/W:0, group0/block2/conv2/bn/beta:0, group2/block3/conv3/bn/gamma:0, group1/block0/conv1/bn/mean/EMA:0, group1/block2/conv3/bn/variance/EMA:0, group2/block0/conv2/W:0, group2/block2/conv3/bn/variance/EMA:0, group0/block0/conv2/bn/mean/EMA:0, group2/block0/conv3/W:0, group2/block4/conv1/bn/mean/EMA:0, group1/block3/conv1/bn/gamma:0, group3/block0/conv1/bn/mean/EMA:0, group1/block0/conv3/bn/gamma:0, group0/block2/conv1/bn/beta:0, group1/block2/conv2/W:0, group2/block5/conv1/W:0, group3/block1/conv1/W:0, group2/block3/conv1/bn/gamma:0, group0/block1/conv2/bn/beta:0, group3/block1/conv2/W:0, group2/block3/conv2/bn/mean/EMA:0, 
group0/block1/conv3/bn/beta:0, group2/block1/conv3/W:0, group2/block5/conv3/bn/gamma:0, group1/block0/conv2/bn/variance/EMA:0, group1/block1/conv3/bn/mean/EMA:0, group2/block4/conv2/bn/mean/EMA:0, group1/block1/conv1/bn/gamma:0, group2/block0/convshortcut/W:0, group2/block1/conv1/bn/gamma:0, group2/block2/conv1/bn/gamma:0, group3/block1/conv3/W:0, group2/block3/conv1/bn/mean/EMA:0, group0/block0/convshortcut/bn/beta:0, group2/block1/conv1/bn/mean/EMA:0, group2/block2/conv3/bn/gamma:0, group0/block1/conv2/bn/mean/EMA:0, group0/block0/convshortcut/bn/gamma:0, group3/block0/conv3/bn/variance/EMA:0, group2/block4/conv3/bn/variance/EMA:0, group2/block1/conv3/bn/gamma:0, group2/block4/conv3/bn/gamma:0, group1/block1/conv1/W:0, group1/block0/conv2/bn/beta:0, group0/block0/convshortcut/bn/variance/EMA:0, group1/block3/conv3/W:0, group2/block5/conv3/bn/beta:0, group1/block2/conv2/bn/gamma:0, group3/block1/conv3/bn/mean/EMA:0, group1/block1/conv1/bn/beta:0, group1/block2/conv3/bn/beta:0, group2/block5/conv2/bn/gamma:0, group3/block2/conv3/W:0, group2/block4/conv1/W:0, group2/block0/conv1/W:0, group1/block0/conv1/bn/gamma:0, group2/block5/conv1/bn/beta:0, group3/block0/conv2/bn/mean/EMA:0, group1/block0/convshortcut/bn/variance/EMA:0, group2/block4/conv2/bn/variance/EMA:0, group2/block2/conv1/bn/mean/EMA:0, group0/block2/conv2/bn/variance/EMA:0, group0/block1/conv1/bn/gamma:0, group1/block0/conv1/bn/beta:0, group0/block0/conv3/bn/gamma:0, group1/block1/conv2/bn/beta:0, group1/block1/conv2/bn/gamma:0, group3/block0/convshortcut/bn/gamma:0, group1/block3/conv1/bn/beta:0, group1/block2/conv1/W:0, group3/block2/conv2/W:0 | |
[0308 18:59:21 @sessinit.py:87] WRN The following variables are in the graph, but not found in the dict: fastrcnn/fc6/W, fastrcnn/fc6/b, fastrcnn/fc7/W, fastrcnn/fc7/b, fastrcnn/outputs/box/W, fastrcnn/outputs/box/b, fastrcnn/outputs/class/W, fastrcnn/outputs/class/b, fpn/lateral_1x1_c2/W, fpn/lateral_1x1_c2/b, fpn/lateral_1x1_c3/W, fpn/lateral_1x1_c3/b, fpn/lateral_1x1_c4/W, fpn/lateral_1x1_c4/b, fpn/lateral_1x1_c5/W, fpn/lateral_1x1_c5/b, fpn/posthoc_3x3_p2/W, fpn/posthoc_3x3_p2/b, fpn/posthoc_3x3_p3/W, fpn/posthoc_3x3_p3/b, fpn/posthoc_3x3_p4/W, fpn/posthoc_3x3_p4/b, fpn/posthoc_3x3_p5/W, fpn/posthoc_3x3_p5/b, global_step, learning_rate, maskrcnn/conv/W, maskrcnn/conv/b, maskrcnn/deconv/W, maskrcnn/deconv/b, maskrcnn/fcn0/W, maskrcnn/fcn0/b, maskrcnn/fcn1/W, maskrcnn/fcn1/b, maskrcnn/fcn2/W, maskrcnn/fcn2/b, maskrcnn/fcn3/W, maskrcnn/fcn3/b, rpn/box/W, rpn/box/b, rpn/class/W, rpn/class/b, rpn/conv0/W, rpn/conv0/b | |
[0308 18:59:21 @sessinit.py:87] WRN The following variables are in the dict, but not found in the graph: linear/W, linear/b | |
[0308 18:59:21 @sessinit.py:217] Restoring 265 variables from dict ... | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 18:59:21.592200 139697895950080 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/tfutils/varmanip.py:106: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 18:59:24 @base.py:242] Graph Finalized. | |
[0308 18:59:24 @trainers.py:453] Rank 3 waiting for initialization broadcasting ... | |
[0308 18:59:34 @varmanip.py:102] WRN Variable group0/block0/convshortcut/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:34 @varmanip.py:102] WRN Variable group0/block1/conv2/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:41 @varmanip.py:102] WRN Variable group0/block2/conv3/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:46 @varmanip.py:102] WRN Variable group0/block0/conv2/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:46 @varmanip.py:102] WRN Variable conv0/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:48 @varmanip.py:102] WRN Variable group0/block0/conv1/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 18:59:53 @varmanip.py:102] WRN Variable group0/block2/conv1/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 19:00:03 @varmanip.py:102] WRN Variable group0/block0/conv3/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 19:00:16 @varmanip.py:102] WRN Variable group0/block1/conv3/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 19:00:20 @varmanip.py:102] WRN Variable group0/block2/conv2/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
[0308 19:00:40 @varmanip.py:102] WRN Variable group0/block1/conv1/W has dtype <dtype: 'float16'> but was given a value of dtype float32. Load it after downcasting! | |
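Annotation (not part of the original console output): a minimal sketch of the "load it after downcasting" behaviour the warnings above describe, assuming a float32 value from the ImageNet checkpoint being assigned to a float16 graph variable; the helper name is illustrative.

import numpy as np
import tensorflow as tf

def load_after_downcasting(sess, var, value):
    target = var.dtype.base_dtype.as_numpy_dtype   # e.g. np.float16 for conv0/W
    if value.dtype != target:
        value = value.astype(target)               # float32 checkpoint value -> float16 variable
    sess.run(var.assign(value))                    # assign, as the Variable.load deprecation note suggests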
[0308 19:01:01 @base.py:242] Graph Finalized. | |
[0308 19:01:01 @trainers.py:451] Broadcasting initialized variables ... | |
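Annotation (not part of the original console output): a minimal sketch of the broadcast step being logged here, assuming the TF1-style horovod.tensorflow API from this environment. Rank 0 restores the checkpoint and its values are broadcast so every rank starts from identical weights; the "Rank N waiting for initialization broadcasting" lines above are the other ranks blocking on this collective, and a rank that never reaches it triggers the stall warning below.

import horovod.tensorflow as hvd
import tensorflow as tf

hvd.init()
bcast = hvd.broadcast_global_variables(root_rank=0)   # one broadcast per global variable, rooted at rank 0
with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(bcast)   # every rank must run this op, or the remaining ranks stall waiting for it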
WARNING: One or more tensors were submitted to be reduced, gathered or broadcasted by subset of ranks and are waiting for remainder of ranks for more than 60 seconds. This may indicate that different ranks are trying to submit different tensors or that only subset of ranks is submitting tensors, which will cause deadlock. | |
Stalled ops: | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_pos_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level4_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c5_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p3_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p4_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p5_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_conv0_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_class_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc6_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc7_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_fg_accuracy_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_deconv_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_global_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_box_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p2_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_QueueInput_queue_size_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p5_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc6_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_maskrcnn_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_class_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn0_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc7_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c4_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn1_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_conv0_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn3_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_box_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p5_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_false_negative_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p4_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c3_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_box_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p2_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p3_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc7_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p4_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_conv0_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_pos_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_convshortcut_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc6_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p4_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn2_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_convshortcut_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_accuracy_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c4_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_convshortcut_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_conv0_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_Identity_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_box_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_box_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_deconv_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_class_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_class_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn2_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c2_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_fg_pixel_ratio_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c5_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p5_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_QueueInput_queue_size_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_class_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_conv_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_fg_accuracy_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_QueueInput_queue_size_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level3_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c3_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c4_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_conv0_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_convshortcut_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_box_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c4_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_accuracy_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_rpn_class_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_num_fg_label_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_num_fg_label_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_wd_cost_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn1_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_num_valid_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_learning_rate_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_maskrcnn_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_convshortcut_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_wd_cost_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_conv1_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_pos_accuracy_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_class_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_fg_accuracy_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_accuracy_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_conv0_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_accuracy_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_class_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_conv_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block1_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block4_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_pos_accuracy_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn3_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_fg_pixel_ratio_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_accuracy_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level4_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc7_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level3_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_num_fg_label_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_convshortcut_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_conv_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn0_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c2_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_deconv_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_box_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block5_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn0_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn0_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_Identity_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block3_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level4_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block1_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_Identity_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_variance_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level3_label_metrics_1_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level4_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_mask_fpn_map_rois_to_levels_batch_num_roi_level3_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_false_negative_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_maskrcnn_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_accuracy_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_fg_pixel_ratio_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_wd_cost_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_fastrcnn_losses_label_metrics_false_negative_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block0_conv3_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_num_valid_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_maskrcnn_loss_pos_accuracy_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv1_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block2_conv1_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level4_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_3_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_precision_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_fc6_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_posthoc_3x3_p2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level4_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level3_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level3_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_3_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_recall_iou0_3_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_conv0_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv1_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_best_iou_per_gt_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_best_iou_per_gt_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_sample_fast_rcnn_targets_batch_proposal_metrics_batch_best_iou_per_gt_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block0_convshortcut_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv3_bn_beta_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group2_block2_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block1_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_conv_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_deconv_b_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_pos_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_label_metrics_1_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_box_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_pos_anchor_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv3_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_multilevel_roi_align_batch_fpn_map_rois_to_levels_batch_num_roi_level3_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_conv0_bn_gamma_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_conv0_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_label_metrics_1_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_2_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block3_conv3_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_1_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv2_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv2_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_pos_anchor_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_num_valid_anchor_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_5_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_box_loss_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level2_box_loss_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block1_conv3_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_loss_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fastrcnn_outputs_box_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c5_b_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_fpn_lateral_1x1_c5_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level6_num_valid_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv2_W_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_2_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_2_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block2_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_5_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_recall_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level5_label_metrics_precision_th0_5_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_maskrcnn_fcn2_W_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_num_valid_anchor_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group3_block0_convshortcut_bn_gamma_Momentum_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_box_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_1_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_loss_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_1_local_step_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group1_block0_conv3_bn_beta_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_EMA_rpn_losses_batch_level4_label_metrics_1_recall_th0_1_biased_0 [missing ranks: 0] | |
horovod_broadcast/HorovodBroadcast_group0_block2_conv1_bn_mean_EMA_0 [missing ranks: 0] | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.403121 139695895922432 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.417641 140655740700416 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.419090 140085798336256 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.462747 139814253213440 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.466423 140223030650624 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.466780 140106948318976 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @concurrency.py:38] Starting EnqueueThread QueueInput/input_queue ... | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gradients_util.py:94: UserWarning: Converting sparse IndexedSlices to a dense Tensor of unknown shape. This may consume a large amount of memory. | |
"Converting sparse IndexedSlices to a dense Tensor of unknown shape. " | |
W0308 19:01:04.526553 140134360606464 deprecation.py:323] From /home/ubuntu/tensorpack-mask-rcnn/tensorpack/callbacks/param.py:79: Variable.load (from tensorflow.python.ops.variables) is deprecated and will be removed in a future version. | |
Instructions for updating: | |
Prefer Variable.assign which has equivalent behavior in 2.X. | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:04 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:05 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:05 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:05 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s][0308 19:01:07 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.003300 | |
[0308 19:01:07 @param.py:158] [HyperParamSetter] At global_step=0, learning_rate is set to 0.010000 | |
[0308 19:01:07 @base.py:274] Start Epoch 1 ... | |
0%| |0/15000[00:00<?,?it/s]ip-172-31-14-112:20669:20797 [0] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO NET/Socket : 1 interfaces found | |
NCCL version 2.3.7+cuda10.0 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO rank 0 nranks 8 | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO rank 4 nranks 8 | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO rank 1 nranks 8 | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO rank 2 nranks 8 | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO rank 6 nranks 8 | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO rank 3 nranks 8 | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO NET/IB : Using interface ens5 for sideband communication | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO rank 5 nranks 8 | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Using internal Network Socket | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO rank 7 nranks 8 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO comm 0x7f0d683c5f30 rank 0 nranks 8 | |
[runtime_tensor] [train.py] total_cost 7.74276972 | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO comm 0x7f6ca43aa4d0 rank 4 nranks 8 | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO comm 0x7fec6c3c62c0 rank 1 nranks 8 | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO comm 0x7f87ac3f41a0 rank 3 nranks 8 | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO comm 0x7f0cf03ae500 rank 5 nranks 8 | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO comm 0x7f73083d7580 rank 2 nranks 8 | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO comm 0x7f67b83ae5c0 rank 6 nranks 8 | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO comm 0x7f28803af190 rank 7 nranks 8 | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO NET : Using interface ens5:172.31.14.112<0> | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO NET/Socket : 1 interfaces found | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO CUDA Dev 2, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO CUDA Dev 1, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO CUDA Dev 3, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO CUDA Dev 4, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO CUDA Dev 6, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO CUDA Dev 5, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO CUDA Dev 7, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO CUDA Dev 0, IP Interfaces : ens5(PHB) | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO NCCL_MIN_NRINGS set by environment to 8. | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Using 256 threads | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Min Comp Cap 7 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 00 : 0 1 2 3 7 5 6 4 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 01 : 0 2 6 7 4 5 1 3 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 02 : 0 3 1 5 4 7 6 2 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 03 : 0 3 2 1 5 6 7 4 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 04 : 0 4 6 5 7 3 2 1 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 05 : 0 4 7 6 5 1 2 3 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 06 : 0 1 2 3 7 5 6 4 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 07 : 0 2 6 7 4 5 1 3 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 08 : 0 3 1 5 4 7 6 2 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 09 : 0 3 2 1 5 6 7 4 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 10 : 0 4 6 5 7 3 2 1 | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 11 : 0 4 7 6 5 1 2 3 | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 00 : 4[4] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 00 : 7[7] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 00 : 2[2] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 00 : 6[6] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 00 : 0[0] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 00 : 5[5] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 00 : 3[3] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 00 : 1[1] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 01 : 2[2] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 01 : 3[3] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 01 : 4[4] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 01 : 7[7] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 01 : 6[6] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 01 : 0[0] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 01 : 1[1] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 01 : 5[5] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 02 : 7[7] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 02 : 3[3] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 02 : 4[4] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 02 : 2[2] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 02 : 1[1] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 02 : 0[0] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 02 : 5[5] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 02 : 6[6] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 03 : 1[1] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 03 : 0[0] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 03 : 6[6] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 03 : 7[7] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 03 : 2[2] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 03 : 3[3] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 03 : 5[5] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 03 : 4[4] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 04 : 4[4] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 04 : 5[5] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 04 : 6[6] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 04 : 7[7] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 04 : 3[3] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 04 : 0[0] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 04 : 1[1] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 04 : 2[2] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 05 : 2[2] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 05 : 1[1] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 05 : 3[3] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 05 : 5[5] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 05 : 6[6] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 05 : 7[7] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 05 : 4[4] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 05 : 0[0] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 06 : 0[0] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 06 : 1[1] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 06 : 2[2] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 06 : 5[5] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 06 : 3[3] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 06 : 4[4] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 06 : 6[6] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 06 : 7[7] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 07 : 6[6] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 07 : 0[0] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 07 : 4[4] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 07 : 7[7] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 07 : 3[3] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 07 : 2[2] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 07 : 5[5] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 07 : 1[1] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 08 : 4[4] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 08 : 3[3] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 08 : 2[2] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 08 : 7[7] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 08 : 6[6] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 08 : 0[0] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 08 : 1[1] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 08 : 5[5] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 09 : 7[7] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 09 : 6[6] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 09 : 0[0] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 09 : 2[2] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 09 : 1[1] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 09 : 5[5] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 09 : 3[3] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 09 : 4[4] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 10 : 7[7] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 10 : 6[6] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 10 : 1[1] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 10 : 0[0] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 10 : 2[2] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 10 : 5[5] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 10 : 4[4] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 10 : 3[3] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO Ring 11 : 3[3] -> 0[0] via P2P/IPC | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO Ring 11 : 4[4] -> 7[7] via P2P/IPC | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO Ring 11 : 5[5] -> 1[1] via P2P/IPC | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO Ring 11 : 6[6] -> 5[5] via P2P/IPC | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO Ring 11 : 7[7] -> 6[6] via P2P/IPC | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO Ring 11 : 1[1] -> 2[2] via P2P/IPC | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO Ring 11 : 2[2] -> 3[3] via P2P/IPC | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Ring 11 : 0[0] -> 4[4] via P2P/IPC | |
ip-172-31-14-112:20672:20778 [3] NCCL INFO comm 0x7f87ac3f41a0 rank 3 nranks 8 - COMPLETE | |
ip-172-31-14-112:20676:20776 [7] NCCL INFO comm 0x7f28803af190 rank 7 nranks 8 - COMPLETE | |
ip-172-31-14-112:20673:20777 [4] NCCL INFO comm 0x7f6ca43aa4d0 rank 4 nranks 8 - COMPLETE | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO comm 0x7f0d683c5f30 rank 0 nranks 8 - COMPLETE | |
ip-172-31-14-112:20674:20783 [5] NCCL INFO comm 0x7f0cf03ae500 rank 5 nranks 8 - COMPLETE | |
ip-172-31-14-112:20675:20782 [6] NCCL INFO comm 0x7f67b83ae5c0 rank 6 nranks 8 - COMPLETE | |
ip-172-31-14-112:20670:20784 [1] NCCL INFO comm 0x7fec6c3c62c0 rank 1 nranks 8 - COMPLETE | |
ip-172-31-14-112:20671:20794 [2] NCCL INFO comm 0x7f73083d7580 rank 2 nranks 8 - COMPLETE | |
ip-172-31-14-112:20669:20797 [0] NCCL INFO Launch mode Parallel | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[0308 19:01:27 @param.py:161] [HyperParamSetter] At global_step=1, learning_rate changes from 0.003300 to 0.003307 | |
[runtime_tensor] [train.py] total_cost 2.68372798 | |
[runtime_tensor] [train.py] total_cost 3.29999471 | |
[runtime_tensor] [train.py] total_cost 2.37285805 | |
[runtime_tensor] [train.py] total_cost 2.15923524 | |
[runtime_tensor] [train.py] total_cost 1.79694939 | |
[runtime_tensor] [train.py] total_cost 1.95457268 | |
[runtime_tensor] [train.py] total_cost 1.6180402 | |
[runtime_tensor] [train.py] total_cost 1.8047204 | |
[runtime_tensor] [train.py] total_cost 2.04750919 | |
[runtime_tensor] [train.py] total_cost 1.98510289 | |
[runtime_tensor] [train.py] total_cost 2.89567876 | |
[runtime_tensor] [train.py] total_cost 1.85854411 | |
[runtime_tensor] [train.py] total_cost 2.44396544 | |
[runtime_tensor] [train.py] total_cost 1.78461266 | |
[runtime_tensor] [train.py] total_cost 1.38053155 | |
[runtime_tensor] [train.py] total_cost 2.01208425 | |
[runtime_tensor] [train.py] total_cost 1.5321486 | |
[runtime_tensor] [train.py] total_cost 1.57560372 | |
[runtime_tensor] [train.py] total_cost 2.14943457 | |
[runtime_tensor] [train.py] total_cost 1.97888362 | |
[runtime_tensor] [train.py] total_cost 1.65204453 | |
[runtime_tensor] [train.py] total_cost 1.56024027 | |
[runtime_tensor] [train.py] total_cost 1.87394798 | |
[runtime_tensor] [train.py] total_cost 2.14857912 | |
[runtime_tensor] [train.py] total_cost 2.6338563 | |
[runtime_tensor] [train.py] total_cost 1.66367006 | |
[runtime_tensor] [train.py] total_cost 1.69520748 | |
[runtime_tensor] [train.py] total_cost 1.51978779 | |
[runtime_tensor] [train.py] total_cost 1.52646828 | |
[runtime_tensor] [train.py] total_cost 1.65277588 | |
[runtime_tensor] [train.py] total_cost 2.66609097 | |
[runtime_tensor] [train.py] total_cost 2.10883 | |
[runtime_tensor] [train.py] total_cost 1.48919618 | |
[runtime_tensor] [train.py] total_cost 1.50616884 | |
[runtime_tensor] [train.py] total_cost 1.49903727 | |
[runtime_tensor] [train.py] total_cost 1.69256687 | |
[runtime_tensor] [train.py] total_cost 1.96962976 | |
[runtime_tensor] [train.py] total_cost 1.7483077 | |
[runtime_tensor] [train.py] total_cost 1.7994194 | |
[runtime_tensor] [train.py] total_cost 1.51916134 | |
[runtime_tensor] [train.py] total_cost 1.50133681 | |
[runtime_tensor] [train.py] total_cost 1.60936475 | |
[runtime_tensor] [train.py] total_cost 1.39384103 | |
[runtime_tensor] [train.py] total_cost 4.32952309 | |
[runtime_tensor] [train.py] total_cost 2.29733944 | |
[runtime_tensor] [train.py] total_cost 2.75384903 | |
[runtime_tensor] [train.py] total_cost 1.92965019 | |
[runtime_tensor] [train.py] total_cost 2.23456717 | |
[runtime_tensor] [train.py] total_cost 1.83206916 | |
[runtime_tensor] [train.py] total_cost 2.06228161 | |
[runtime_tensor] [train.py] total_cost 1.77524638 | |
[runtime_tensor] [train.py] total_cost 2.49890494 | |
[runtime_tensor] [train.py] total_cost 1.82528448 | |
[runtime_tensor] [train.py] total_cost 2.19772911 | |
[runtime_tensor] [train.py] total_cost 1.49935699 | |
[runtime_tensor] [train.py] total_cost 2.32633138 | |
[runtime_tensor] [train.py] total_cost 2.20369101 | |
[runtime_tensor] [train.py] total_cost 1.68592381 | |
[runtime_tensor] [train.py] total_cost 1.70486045 | |
[runtime_tensor] [train.py] total_cost 2.13867712 | |
[runtime_tensor] [train.py] total_cost 1.70506871 | |
[runtime_tensor] [train.py] total_cost 2.68161631 | |
[runtime_tensor] [train.py] total_cost 1.49807453 | |
[runtime_tensor] [train.py] total_cost 1.6216445 | |
[runtime_tensor] [train.py] total_cost 1.4256649 | |
[runtime_tensor] [train.py] total_cost 1.62524271 | |
[runtime_tensor] [train.py] total_cost 1.51781106 | |
[runtime_tensor] [train.py] total_cost 1.49771595 | |
[runtime_tensor] [train.py] total_cost 1.6821034 | |
[runtime_tensor] [train.py] total_cost 1.51543331 | |
[runtime_tensor] [train.py] total_cost 1.75046885 | |
[runtime_tensor] [train.py] total_cost 2.09669566 | |
[runtime_tensor] [train.py] total_cost 1.54705167 | |
[runtime_tensor] [train.py] total_cost 2.07160735 | |
[runtime_tensor] [train.py] total_cost 1.81216621 | |
[runtime_tensor] [train.py] total_cost 3.31781292 | |
[runtime_tensor] [train.py] total_cost 1.65445542 | |
[runtime_tensor] [train.py] total_cost 2.82713652 | |
[runtime_tensor] [train.py] total_cost 2.55543518 | |
[runtime_tensor] [train.py] total_cost 1.65127409 | |
[runtime_tensor] [train.py] total_cost 1.86816692 | |
[runtime_tensor] [train.py] total_cost 1.98674822 | |
[runtime_tensor] [train.py] total_cost 1.8982687 | |
[runtime_tensor] [train.py] total_cost 2.18057346 | |
[runtime_tensor] [train.py] total_cost 1.66095722 | |
[runtime_tensor] [train.py] total_cost 1.46637034 | |
[runtime_tensor] [train.py] total_cost 1.43194032 | |
[runtime_tensor] [train.py] total_cost 3.3553946 | |
[runtime_tensor] [train.py] total_cost 1.47296846 | |
[runtime_tensor] [train.py] total_cost 2.14104986 | |
[runtime_tensor] [train.py] total_cost 1.2747128 | |
[runtime_tensor] [train.py] total_cost 1.44938612 | |
[runtime_tensor] [train.py] total_cost 2.49776793 | |
[runtime_tensor] [train.py] total_cost 1.5389297 | |
[runtime_tensor] [train.py] total_cost 1.80390501 | |
[runtime_tensor] [train.py] total_cost 1.86149466 | |
[runtime_tensor] [train.py] total_cost 1.52676535 | |
[runtime_tensor] [train.py] total_cost 1.80225301 | |
1%| |99/15000[01:00<2:30:57, 1.65it/s] | |
1%| |99/15000[01:00<2:30:58, 1.65it/s] | |
1%| |99/15000[01:00<2:30:45, 1.65it/s] | |
1%| |99/15000[01:00<2:31:18, 1.64it/s] | |
1%| |99/15000[01:00<2:31:13, 1.64it/s] | |
1%| |99/15000[01:00<2:31:13, 1.64it/s][runtime_tensor] [train.py] total_cost 1.56066203 | |
1%| |100/15000[01:00<2:29:18, 1.66it/s][runtime_tensor] [train.py] total_cost 3.16070533 | |
[runtime_tensor] [train.py] total_cost 1.55273366 | |
[runtime_tensor] [train.py] total_cost 2.39772511 | |
[runtime_tensor] [train.py] total_cost 1.77676845 | |
[runtime_tensor] [train.py] total_cost 1.80637848 | |
[runtime_tensor] [train.py] total_cost 1.73814237 | |
[runtime_tensor] [train.py] total_cost 1.82725048 | |
[runtime_tensor] [train.py] total_cost 1.67076957 | |
[runtime_tensor] [train.py] total_cost 1.5470376 | |
[runtime_tensor] [train.py] total_cost 2.07420778 | |
[runtime_tensor] [train.py] total_cost 2.45469379 | |
1%| |111/15000[01:00<2:14:22, 1.85it/s][runtime_tensor] [train.py] total_cost 1.61869931 | |
[runtime_tensor] [train.py] total_cost 1.47121203 | |
[runtime_tensor] [train.py] total_cost 1.80055571 | |
[runtime_tensor] [train.py] total_cost 1.51263833 | |
[runtime_tensor] [train.py] total_cost 1.99709439 | |
[runtime_tensor] [train.py] total_cost 1.80661726 | |
[runtime_tensor] [train.py] total_cost 1.53217459 | |
[runtime_tensor] [train.py] total_cost 1.87826085 | |
[runtime_tensor] [train.py] total_cost 2.2569 | |
[runtime_tensor] [train.py] total_cost 1.73720837 | |
[runtime_tensor] [train.py] total_cost 1.3138231 | |
[runtime_tensor] [train.py] total_cost 1.55934024 | |
[runtime_tensor] [train.py] total_cost 3.50098944 | |
[runtime_tensor] [train.py] total_cost 1.82770824 | |
[runtime_tensor] [train.py] total_cost 1.63414836 | |
[runtime_tensor] [train.py] total_cost 2.07247 | |
[runtime_tensor] [train.py] total_cost 2.79369879 | |
[runtime_tensor] [train.py] total_cost 1.60313714 | |
[runtime_tensor] [train.py] total_cost 1.6643703 | |
[runtime_tensor] [train.py] total_cost 1.72318578 | |
[runtime_tensor] [train.py] total_cost 1.45365429 | |
[runtime_tensor] [train.py] total_cost 1.58793759 | |
[runtime_tensor] [train.py] total_cost 1.62271953 | |
[runtime_tensor] [train.py] total_cost 1.78072798 | |
[runtime_tensor] [train.py] total_cost 1.6491847 | |
[runtime_tensor] [train.py] total_cost 1.50066388 | |
[runtime_tensor] [train.py] total_cost 1.76127195 | |
[runtime_tensor] [train.py] total_cost 2.9022522 | |
[runtime_tensor] [train.py] total_cost 1.80746269 | |
[runtime_tensor] [train.py] total_cost 1.81901264 | |
[runtime_tensor] [train.py] total_cost 1.92828906 | |
[runtime_tensor] [train.py] total_cost 1.69186568 | |
[runtime_tensor] [train.py] total_cost 1.87945437 | |
[runtime_tensor] [train.py] total_cost 1.77583015 | |
[runtime_tensor] [train.py] total_cost 1.73773742 | |
[runtime_tensor] [train.py] total_cost 1.65560842 | |
[runtime_tensor] [train.py] total_cost 1.37663341 | |
[runtime_tensor] [train.py] total_cost 1.37293553 | |
[runtime_tensor] [train.py] total_cost 2.37522125 | |
[runtime_tensor] [train.py] total_cost 1.45609069 | |
[runtime_tensor] [train.py] total_cost 8.68362331 | |
[runtime_tensor] [train.py] total_cost 1.42834663 | |
[runtime_tensor] [train.py] total_cost 1.98457 | |
[runtime_tensor] [train.py] total_cost 2.56650066 | |
[runtime_tensor] [train.py] total_cost 1.96816754 | |
[runtime_tensor] [train.py] total_cost 2.35487986 | |
[runtime_tensor] [train.py] total_cost 1.68015075 | |
[runtime_tensor] [train.py] total_cost 2.64335108 | |
[runtime_tensor] [train.py] total_cost 2.05562544 | |
[runtime_tensor] [train.py] total_cost 2.28359556 | |
[runtime_tensor] [train.py] total_cost 1.60458016 | |
[runtime_tensor] [train.py] total_cost 1.62639725 | |
[runtime_tensor] [train.py] total_cost 3.04388165 | |
[runtime_tensor] [train.py] total_cost 2.22289658 | |
[runtime_tensor] [train.py] total_cost 1.56351399 | |
[runtime_tensor] [train.py] total_cost 1.52321053 | |
[runtime_tensor] [train.py] total_cost 1.73867118 | |
[runtime_tensor] [train.py] total_cost 1.46658468 | |
[runtime_tensor] [train.py] total_cost 3.40237284 | |
[runtime_tensor] [train.py] total_cost 1.92706013 | |
[runtime_tensor] [train.py] total_cost 3.33335638 | |
[runtime_tensor] [train.py] total_cost 2.82642603 | |
[runtime_tensor] [train.py] total_cost 2.48535872 | |
[runtime_tensor] [train.py] total_cost 1.65020025 | |
[runtime_tensor] [train.py] total_cost 1.8403616 | |
[runtime_tensor] [train.py] total_cost 1.76992881 | |
[runtime_tensor] [train.py] total_cost 1.70384812 | |
[runtime_tensor] [train.py] total_cost 1.49547684 | |
[runtime_tensor] [train.py] total_cost 3.52192569 | |
[runtime_tensor] [train.py] total_cost 1.84280682 | |
[runtime_tensor] [train.py] total_cost 1.95146108 | |
[runtime_tensor] [train.py] total_cost 2.20358682 | |
[runtime_tensor] [train.py] total_cost 2.44687843 | |
[runtime_tensor] [train.py] total_cost 2.1141367 | |
[runtime_tensor] [train.py] total_cost 1.78464508 | |
[runtime_tensor] [train.py] total_cost 1.78705359 | |
[runtime_tensor] [train.py] total_cost 1.43794358 | |
[runtime_tensor] [train.py] total_cost 1.41164923 | |
[runtime_tensor] [train.py] total_cost 1.69273233 | |
[runtime_tensor] [train.py] total_cost 1.87850595 | |
[runtime_tensor] [train.py] total_cost 1.74243188 | |
[runtime_tensor] [train.py] total_cost 1.76256025 | |
[runtime_tensor] [train.py] total_cost 1.60745418 | |
[runtime_tensor] [train.py] total_cost 1.24730587 | |
[runtime_tensor] [train.py] total_cost 1.69991541 | |
1%|1 |195/15000[01:20<2:30:20, 1.64it/s] | |
1%|1 |196/15000[01:20<2:30:14, 1.64it/s] | |
1%|1 |196/15000[01:20<2:30:14, 1.64it/s][runtime_tensor] [train.py] total_cost 1.83049726 | |
1%|1 |196/15000[01:20<2:29:59, 1.65it/s] | |
1%|1 |196/15000[01:20<2:29:58, 1.65it/s] | |
1%|1 |197/15000[01:20<2:29:45, 1.65it/s][runtime_tensor] [train.py] total_cost 1.5638783 | |
1%|1 |198/15000[01:20<2:28:19, 1.66it/s][runtime_tensor] [train.py] total_cost 1.926512 | |
[runtime_tensor] [train.py] total_cost 1.86500323 | |
[runtime_tensor] [train.py] total_cost 1.62726 | |
[runtime_tensor] [train.py] total_cost 1.66358805 | |
[runtime_tensor] [train.py] total_cost 1.85985184 | |
[runtime_tensor] [train.py] total_cost 1.66700244 | |
[runtime_tensor] [train.py] total_cost 1.71822977 | |
[runtime_tensor] [train.py] total_cost 1.9840827 | |
[runtime_tensor] [train.py] total_cost 1.64785361 | |
[runtime_tensor] [train.py] total_cost 1.77589238 | |
[runtime_tensor] [train.py] total_cost 1.73721886 | |
1%|1 |209/15000[01:20<2:13:29, 1.85it/s][runtime_tensor] [train.py] total_cost 1.43297899 | |
[runtime_tensor] [train.py] total_cost 2.76672029 | |
[runtime_tensor] [train.py] total_cost 1.87888288 | |
[runtime_tensor] [train.py] total_cost 2.87847614 | |
[runtime_tensor] [train.py] total_cost 2.39872694 | |
[runtime_tensor] [train.py] total_cost 1.94121885 | |
[runtime_tensor] [train.py] total_cost 2.03311753 | |
[runtime_tensor] [train.py] total_cost 1.53941274 | |
[runtime_tensor] [train.py] total_cost 1.80498338 | |
[runtime_tensor] [train.py] total_cost 1.41814756 | |
[runtime_tensor] [train.py] total_cost 1.79106951 | |
[runtime_tensor] [train.py] total_cost 1.66424692 | |
[runtime_tensor] [train.py] total_cost 2.55854321 | |
[runtime_tensor] [train.py] total_cost 1.78778386 | |
[runtime_tensor] [train.py] total_cost 1.55733585 | |
[runtime_tensor] [train.py] total_cost 1.83205438 | |
[runtime_tensor] [train.py] total_cost 2.90235877 | |
[runtime_tensor] [train.py] total_cost 3.38499498 | |
[runtime_tensor] [train.py] total_cost 1.79549313 | |
[runtime_tensor] [train.py] total_cost 1.89728928 | |
[runtime_tensor] [train.py] total_cost 1.66043377 | |
[runtime_tensor] [train.py] total_cost 2.03973913 | |
[runtime_tensor] [train.py] total_cost 1.82356048 | |
[runtime_tensor] [train.py] total_cost 3.80715895 | |
[runtime_tensor] [train.py] total_cost 2.65155888 | |
[runtime_tensor] [train.py] total_cost 2.27582145 | |
[runtime_tensor] [train.py] total_cost 1.64307356 | |
[runtime_tensor] [train.py] total_cost 1.54805899 | |
[runtime_tensor] [train.py] total_cost 1.94195819 | |
[runtime_tensor] [train.py] total_cost 1.69536543 | |
[runtime_tensor] [train.py] total_cost 1.79404831 | |
[runtime_tensor] [train.py] total_cost 1.93038428 | |
[runtime_tensor] [train.py] total_cost 1.80141032 | |
[runtime_tensor] [train.py] total_cost 1.6861937 | |
[runtime_tensor] [train.py] total_cost 1.4884491 | |
[runtime_tensor] [train.py] total_cost 1.80019748 | |
[runtime_tensor] [train.py] total_cost 1.54179 | |
[runtime_tensor] [train.py] total_cost 1.35995579 | |
[runtime_tensor] [train.py] total_cost 2.17170811 | |
[runtime_tensor] [train.py] total_cost 1.69693089 | |
[runtime_tensor] [train.py] total_cost 1.82187724 | |
[runtime_tensor] [train.py] total_cost 1.36780667 | |
[runtime_tensor] [train.py] total_cost 1.76488447 | |
[runtime_tensor] [train.py] total_cost 1.70155907 | |
[runtime_tensor] [train.py] total_cost 1.52673888 | |
[runtime_tensor] [train.py] total_cost 1.95993662 | |
[runtime_tensor] [train.py] total_cost 2.06336546 | |
[runtime_tensor] [train.py] total_cost 1.86459255 | |
[runtime_tensor] [train.py] total_cost 2.13582587 | |
[runtime_tensor] [train.py] total_cost 2.02194357 | |
[runtime_tensor] [train.py] total_cost 1.74479055 | |
[runtime_tensor] [train.py] total_cost 2.10706067 | |
[runtime_tensor] [train.py] total_cost 1.59439 | |
[runtime_tensor] [train.py] total_cost 1.64635122 | |
[runtime_tensor] [train.py] total_cost 1.44585669 | |
[runtime_tensor] [train.py] total_cost 1.55375981 | |
[runtime_tensor] [train.py] total_cost 1.55544639 | |
[runtime_tensor] [train.py] total_cost 1.46667445 | |
[runtime_tensor] [train.py] total_cost 3.45109248 | |
[runtime_tensor] [train.py] total_cost 1.70504546 | |
[runtime_tensor] [train.py] total_cost 1.43916738 | |
[runtime_tensor] [train.py] total_cost 3.05307484 | |
[runtime_tensor] [train.py] total_cost 1.79923809 | |
[runtime_tensor] [train.py] total_cost 2.31075931 | |
[runtime_tensor] [train.py] total_cost 3.37721586 | |
[runtime_tensor] [train.py] total_cost 2.4738121 | |
[runtime_tensor] [train.py] total_cost 2.27788806 | |
[runtime_tensor] [train.py] total_cost 1.97916532 | |
[runtime_tensor] [train.py] total_cost 2.04248357 | |
[runtime_tensor] [train.py] total_cost 3.59475327 | |
[runtime_tensor] [train.py] total_cost 1.84412038 | |
[runtime_tensor] [train.py] total_cost 2.67348576 | |
[runtime_tensor] [train.py] total_cost 2.93339229 | |
2019-03-08 19:02:43.530004: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.543948: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.554374: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.559896: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.563595: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.563752: F tensorflow/stream_executor/gpu/gpu_timer.cc:65] Check failed: start_event_ != nullptr && stop_event_ != nullptr
2019-03-08 19:02:43.564272: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
2019-03-08 19:02:43.564412: F tensorflow/stream_executor/gpu/gpu_timer.cc:65] Check failed: start_event_ != nullptr && stop_event_ != nullptr
2019-03-08 19:02:43.566607: F tensorflow/stream_executor/gpu/gpu_timer.cc:65] Check failed: start_event_ != nullptr && stop_event_ != nullptr
2019-03-08 19:02:43.566757: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED
 2%|1 |282/15000[01:38<1:25:20, 2.87it/s]
 2%|1 |282/15000[01:38<1:25:21, 2.87it/s]
 2%|1 |282/15000[01:38<1:25:16, 2.88it/s]
[ip-172-31-14-112:20673] *** Process received signal ***
[ip-172-31-14-112:20673] Signal: Aborted (6)
[ip-172-31-14-112:20673] Signal code: (-6)
[ip-172-31-14-112:20673] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7f6d30a12390]
[ip-172-31-14-112:20673] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x38)[0x7f6d3066c428]
[ip-172-31-14-112:20673] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x16a)[0x7f6d3066e02a]
[ip-172-31-14-112:20673] [ 3] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so(+0x6c66fb4)[0x7f6d12b85fb4]
[ip-172-31-14-112:20673] [ 4] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZNK15stream_executor3gpu8GpuTimer22GetElapsedMillisecondsEv+0x97)[0x7f6d0b98f507]
[ip-172-31-14-112:20673] [ 5] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZNK15stream_executor3gpu8GpuTimer12MicrosecondsEv+0x9)[0x7f6d0b8f8959]
[ip-172-31-14-112:20673] [ 6] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so(_ZN10tensorflow10BiasGradOpIN5Eigen9GpuDeviceENS1_4halfEE7ComputeEPNS_15OpKernelContextE+0x33a)[0x7f6d0fa952da]
[ip-172-31-14-112:20673] [ 7] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZN10tensorflow13BaseGPUDevice13ComputeHelperEPNS_8OpKernelEPNS_15OpKernelContextE+0x48a)[0x7f6d0b46aa6a]
[ip-172-31-14-112:20673] [ 8] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZN10tensorflow13BaseGPUDevice7ComputeEPNS_8OpKernelEPNS_15OpKernelContextE+0x2a)[0x7f6d0b46b78a]
[ip-172-31-14-112:20673] [ 9] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(+0x77dbe0)[0x7f6d0b4c1be0]
[ip-172-31-14-112:20673] [10] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(+0x77dc6f)[0x7f6d0b4c1c6f]
[ip-172-31-14-112:20673] [11] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZN5Eigen15ThreadPoolTemplIN10tensorflow6thread16EigenEnvironmentEE10WorkerLoopEi+0x2e2)[0x7f6d0b550c72]
[ip-172-31-14-112:20673] [12] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZNSt17_Function_handlerIFvvEZN10tensorflow6thread16EigenEnvironment12CreateThreadESt8functionIS0_EEUlvE_E9_M_invokeERKSt9_Any_data+0x48)[0x7f6d0b54de68]
[ip-172-31-14-112:20673] [13] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/bin/../lib/libstdc++.so.6(+0xafc5c)[0x7f6d1e74ec5c]
[ip-172-31-14-112:20673] [14] /lib/x86_64-linux-gnu/libpthread.so.0(+0x76ba)[0x7f6d30a086ba]
[ip-172-31-14-112:20673] [15] /lib/x86_64-linux-gnu/libc.so.6(clone+0x6d)[0x7f6d3073e41d]
[ip-172-31-14-112:20673] *** End of error message ***
[ip-172-31-14-112:20674] *** Process received signal ***
[ip-172-31-14-112:20674] Signal: Aborted (6)
[ip-172-31-14-112:20674] Signal code: (-6)
[ip-172-31-14-112:20674] [ 0] /lib/x86_64-linux-gnu/libpthread.so.0(+0x11390)[0x7f0d7bff5390]
[ip-172-31-14-112:20674] [ 1] /lib/x86_64-linux-gnu/libc.so.6(gsignal+0x38)[0x7f0d7bc4f428]
[ip-172-31-14-112:20674] [ 2] /lib/x86_64-linux-gnu/libc.so.6(abort+0x16a)[0x7f0d7bc5102a]
[ip-172-31-14-112:20674] [ 3] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/_pywrap_tensorflow_internal.so(+0x6c66fb4)[0x7f0d5e168fb4]
[ip-172-31-14-112:20674] [ 4] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZNK15stream_executor3gpu8GpuTimer22GetElapsedMillisecondsEv+0x97)[0x7f0d56f72507]
[ip-172-31-14-112:20674] [ 5] /home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/../libtensorflow_framework.so(_ZNK15stream_executor3gpu8GpuTimer12MicrosecondsEv+0x9)[0x7f0d56edb959]
2%|1 |282/15000[01:37<1:25:03, 2.88it/s] | |
2%|1 |282/15000[01:37<1:25:12, 2.88it/s] | |
Traceback (most recent call last): | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call | |
return fn(*args) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1320, in _run_fn | |
options, feed_dict, fetch_list, target_list, run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1408, in _call_tf_sessionrun | |
run_metadata) | |
tensorflow.python.framework.errors_impl.InternalError: Blas GEMM launch failed : a.shape=(8601600, 1), b.shape=(1, 4), m=8601600, n=4, k=1 | |
[[{{node fpn/fpn/upsample_lat3/Tensordot/MatMul}}]] | |
[[gradients/rpn/rpn/box/Conv2D_grad/ShapeN/_4596]] | |
During handling of the above exception, another exception occurred: | |
Traceback (most recent call last): | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 193, in _run_module_as_main | |
"__main__", mod_spec) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 85, in _run_code | |
exec(code, run_globals) | |
File "/home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/train.py", line 651, in <module> | |
launch_train_with_config(traincfg, trainer) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/interface.py", line 94, in launch_train_with_config | |
extra_callbacks=config.extra_callbacks) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 343, in train_with_defaults | |
steps_per_epoch, starting_epoch, max_epoch) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 315, in train | |
self.main_loop(steps_per_epoch, starting_epoch, max_epoch) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/utils/argtools.py", line 176, in wrapper | |
return func(*args, **kwargs) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 280, in main_loop | |
self.run_step() # implemented by subclass | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 180, in run_step | |
self.hooked_sess.run(self.train_op) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 694, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1189, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1288, in run | |
raise six.reraise(*original_exc_info) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/six.py", line 693, in reraise | |
raise value | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1273, in run | |
return self._sess.run(*args, **kwargs) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1345, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1109, in run | |
return self._sess.run(*args, **kwargs) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 930, in run | |
run_metadata_ptr) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1153, in _run | |
feed_dict_tensor, options, run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1329, in _do_run | |
run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1349, in _do_call | |
raise type(e)(node_def, op, message) | |
tensorflow.python.framework.errors_impl.InternalError: Blas GEMM launch failed : a.shape=(8601600, 1), b.shape=(1, 4), m=8601600, n=4, k=1 | |
[[node fpn/fpn/upsample_lat3/Tensordot/MatMul (defined at /tensorpack-mask-rcnn/tensorpack/models/pool.py:130) ]] | |
[[gradients/rpn/rpn/box/Conv2D_grad/ShapeN/_4596]] | |
Original stack trace for 'fpn/fpn/upsample_lat3/Tensordot/MatMul': | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 193, in _run_module_as_main | |
"__main__", mod_spec) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 85, in _run_code | |
exec(code, run_globals) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 651, in <module> | |
launch_train_with_config(traincfg, trainer) | |
File "/tensorpack-mask-rcnn/tensorpack/train/interface.py", line 84, in launch_train_with_config | |
model._build_graph_get_cost, model.get_optimizer) | |
File "/tensorpack-mask-rcnn/tensorpack/utils/argtools.py", line 176, in wrapper | |
return func(*args, **kwargs) | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 216, in setup_graph | |
train_callbacks = self._setup_graph(input, get_cost_fn, get_opt_fn) | |
File "/tensorpack-mask-rcnn/tensorpack/train/trainers.py", line 410, in _setup_graph | |
grads = self._make_get_grad_fn(input, get_cost_fn, get_opt_fn)() | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 283, in get_grad_fn | |
return compute_grad_from_inputs(*inputs) | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 247, in compute_grad_from_inputs | |
cost = get_cost_fn(*inputs) | |
File "/tensorpack-mask-rcnn/tensorpack/tfutils/tower.py", line 286, in __call__ | |
output = self._tower_fn(*args) | |
File "/tensorpack-mask-rcnn/tensorpack/graph_builder/model_desc.py", line 262, in _build_graph_get_cost | |
ret = self.build_graph(*inputs) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 124, in build_graph | |
features = self.backbone(images) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 193, in backbone | |
p23456 = fpn_model('fpn', c2345, fp16=self.fp16) | |
File "/tensorpack-mask-rcnn/tensorpack/models/registry.py", line 128, in wrapped_func | |
outputs = func(*args, **actual_args) | |
File "/tensorpack-mask-rcnn/MaskRCNN/model_fpn.py", line 80, in fpn_model | |
lat = lat + upsample2x('upsample_lat{}'.format(6 - idx), lat_sum_5432[-1]) | |
File "/tensorpack-mask-rcnn/MaskRCNN/model_fpn.py", line 57, in upsample2x | |
data_format='channels_first') | |
File "/tensorpack-mask-rcnn/tensorpack/models/registry.py", line 128, in wrapped_func | |
outputs = func(*args, **actual_args) | |
File "/tensorpack-mask-rcnn/tensorpack/models/pool.py", line 130, in FixedUnPooling | |
ret = tf.tensordot(x, mat, axes=1) # bxcxhxwxshxsw | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 3641, in tensordot | |
ab_matmul = matmul(a_reshape, b_reshape) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 2513, in matmul | |
a, b, transpose_a=transpose_a, transpose_b=transpose_b, name=name) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gen_math_ops.py", line 5675, in mat_mul | |
name=name) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 800, in _apply_op_helper | |
op_def=op_def) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func | |
return func(*args, **kwargs) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3473, in create_op | |
op_def=op_def) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1961, in __init__ | |
self._traceback = tf_stack.extract_stack() | |
2019-03-08 19:02:43.688498: E tensorflow/stream_executor/cuda/cuda_blas.cc:694] failed to run cuBLAS routine cublasSgemmEx: CUBLAS_STATUS_EXECUTION_FAILED | |
2%|1 |282/15000[01:35<1:23:19, 2.94it/s] | |
Traceback (most recent call last): | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1335, in _do_call | |
return fn(*args) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1320, in _run_fn | |
options, feed_dict, fetch_list, target_list, run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1408, in _call_tf_sessionrun | |
run_metadata) | |
tensorflow.python.framework.errors_impl.InternalError: Blas GEMM launch failed : a.shape=(8601600, 1), b.shape=(1, 4), m=8601600, n=4, k=1 | |
[[{{node fpn/fpn/upsample_lat3/Tensordot/MatMul}}]] | |
[[gradients/rpn/rpn/box/Conv2D_grad/ShapeN/_5236]] | |
During handling of the above exception, another exception occurred: | |
Traceback (most recent call last): | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 193, in _run_module_as_main | |
"__main__", mod_spec) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 85, in _run_code | |
exec(code, run_globals) | |
File "/home/ubuntu/tensorpack-mask-rcnn/MaskRCNN/train.py", line 651, in <module> | |
launch_train_with_config(traincfg, trainer) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/interface.py", line 94, in launch_train_with_config | |
extra_callbacks=config.extra_callbacks) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 343, in train_with_defaults | |
steps_per_epoch, starting_epoch, max_epoch) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 315, in train | |
self.main_loop(steps_per_epoch, starting_epoch, max_epoch) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/utils/argtools.py", line 176, in wrapper | |
return func(*args, **kwargs) | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 280, in main_loop | |
self.run_step() # implemented by subclass | |
File "/home/ubuntu/tensorpack-mask-rcnn/tensorpack/train/base.py", line 180, in run_step | |
self.hooked_sess.run(self.train_op) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 694, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1189, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1288, in run | |
raise six.reraise(*original_exc_info) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/six.py", line 693, in reraise | |
raise value | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1273, in run | |
return self._sess.run(*args, **kwargs) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1345, in run | |
run_metadata=run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/training/monitored_session.py", line 1109, in run | |
return self._sess.run(*args, **kwargs) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 930, in run | |
run_metadata_ptr) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1153, in _run | |
feed_dict_tensor, options, run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1329, in _do_run | |
run_metadata) | |
File "/home/ubuntu/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/client/session.py", line 1349, in _do_call | |
raise type(e)(node_def, op, message) | |
tensorflow.python.framework.errors_impl.InternalError: Blas GEMM launch failed : a.shape=(8601600, 1), b.shape=(1, 4), m=8601600, n=4, k=1 | |
[[node fpn/fpn/upsample_lat3/Tensordot/MatMul (defined at /tensorpack-mask-rcnn/tensorpack/models/pool.py:130) ]] | |
[[gradients/rpn/rpn/box/Conv2D_grad/ShapeN/_5236]] | |
Original stack trace for 'fpn/fpn/upsample_lat3/Tensordot/MatMul': | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 193, in _run_module_as_main | |
"__main__", mod_spec) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/runpy.py", line 85, in _run_code | |
exec(code, run_globals) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 651, in <module> | |
launch_train_with_config(traincfg, trainer) | |
File "/tensorpack-mask-rcnn/tensorpack/train/interface.py", line 84, in launch_train_with_config | |
model._build_graph_get_cost, model.get_optimizer) | |
File "/tensorpack-mask-rcnn/tensorpack/utils/argtools.py", line 176, in wrapper | |
return func(*args, **kwargs) | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 216, in setup_graph | |
train_callbacks = self._setup_graph(input, get_cost_fn, get_opt_fn) | |
File "/tensorpack-mask-rcnn/tensorpack/train/trainers.py", line 410, in _setup_graph | |
grads = self._make_get_grad_fn(input, get_cost_fn, get_opt_fn)() | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 283, in get_grad_fn | |
return compute_grad_from_inputs(*inputs) | |
File "/tensorpack-mask-rcnn/tensorpack/train/tower.py", line 247, in compute_grad_from_inputs | |
cost = get_cost_fn(*inputs) | |
File "/tensorpack-mask-rcnn/tensorpack/tfutils/tower.py", line 286, in __call__ | |
output = self._tower_fn(*args) | |
File "/tensorpack-mask-rcnn/tensorpack/graph_builder/model_desc.py", line 262, in _build_graph_get_cost | |
ret = self.build_graph(*inputs) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 124, in build_graph | |
features = self.backbone(images) | |
File "/tensorpack-mask-rcnn/MaskRCNN/train.py", line 193, in backbone | |
p23456 = fpn_model('fpn', c2345, fp16=self.fp16) | |
File "/tensorpack-mask-rcnn/tensorpack/models/registry.py", line 128, in wrapped_func | |
outputs = func(*args, **actual_args) | |
File "/tensorpack-mask-rcnn/MaskRCNN/model_fpn.py", line 80, in fpn_model | |
lat = lat + upsample2x('upsample_lat{}'.format(6 - idx), lat_sum_5432[-1]) | |
File "/tensorpack-mask-rcnn/MaskRCNN/model_fpn.py", line 57, in upsample2x | |
data_format='channels_first') | |
File "/tensorpack-mask-rcnn/tensorpack/models/registry.py", line 128, in wrapped_func | |
outputs = func(*args, **actual_args) | |
File "/tensorpack-mask-rcnn/tensorpack/models/pool.py", line 130, in FixedUnPooling | |
ret = tf.tensordot(x, mat, axes=1) # bxcxhxwxshxsw | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 3641, in tensordot | |
ab_matmul = matmul(a_reshape, b_reshape) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/math_ops.py", line 2513, in matmul | |
a, b, transpose_a=transpose_a, transpose_b=transpose_b, name=name) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/ops/gen_math_ops.py", line 5675, in mat_mul | |
name=name) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/op_def_library.py", line 800, in _apply_op_helper | |
op_def=op_def) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/util/deprecation.py", line 507, in new_func | |
return func(*args, **kwargs) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 3473, in create_op | |
op_def=op_def) | |
File "/anaconda3/envs/tensorflow_p36_13rc1/lib/python3.6/site-packages/tensorflow/python/framework/ops.py", line 1961, in __init__ | |
self._traceback = tf_stack.extract_stack() | |
------------------------------------------------------- | |
Primary job terminated normally, but 1 process returned | |
a non-zero exit code. Per user-direction, the job has been aborted. | |
------------------------------------------------------- | |
-------------------------------------------------------------------------- | |
mpirun noticed that process rank 4 with PID 0 on node ip-172-31-14-112 exited on signal 6 (Aborted). | |
-------------------------------------------------------------------------- |
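Reading the failure: each crashed rank reports the same InternalError, a "Blas GEMM launch failed" inside node fpn/fpn/upsample_lat3/Tensordot/MatMul, which is the tf.tensordot(x, mat, axes=1) call in tensorpack's FixedUnPooling (tensorpack/models/pool.py:130) used for the FPN 2x upsample. The sketch below is not part of the run above; it is a minimal, hypothetical repro that assumes TensorFlow 1.x (e.g. the tensorflow_p36_13rc1 env) with one visible GPU, and simply re-runs a GEMM with the shapes copied from the error message (a: 8601600x1, b: 1x4) outside the training graph.

# Hypothetical, minimal repro sketch -- not from the original run.
# Assumes: TensorFlow 1.x and a visible GPU.
# Shapes are copied from the error above: a.shape=(8601600, 1), b.shape=(1, 4).
import tensorflow as tf

def main():
    with tf.device("/gpu:0"):
        a = tf.random_uniform([8601600, 1], dtype=tf.float32)
        b = tf.random_uniform([1, 4], dtype=tf.float32)
        c = tf.matmul(a, b)        # the MatMul that cuBLAS failed to launch
        total = tf.reduce_sum(c)   # force the product to actually execute

    config = tf.ConfigProto()
    config.gpu_options.allow_growth = True  # don't pre-allocate the whole GPU
    with tf.Session(config=config) as sess:
        print(sess.run(total))

if __name__ == "__main__":
    main()

If this isolated GEMM also fails on the same machine, the cause is likely environmental (GPU memory pressure from the other ranks, or the CUDA/cuBLAS install) rather than the model code; running it once per GPU (for example with CUDA_VISIBLE_DEVICES set to the GPU of the rank that aborted) helps narrow that down.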